Dataset Viewer

| 0 (string, 12 classes) | 1 (float64, range 0 to 55.9k) |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 176.714844 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.100832 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.083936 |
| megatron.core.transformer.attention.forward.core_attention | 838.296509 |
| megatron.core.transformer.attention.forward.linear_proj | 3.306048 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,020.187622 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 234.378403 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.80816 |
| megatron.core.transformer.mlp.forward.activation | 156.001373 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.775136 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 158.650238 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.37152 |
| megatron.core.transformer.attention.forward.qkv | 0.430112 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.07056 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.077536 |
| megatron.core.transformer.attention.forward.core_attention | 1.235072 |
| megatron.core.transformer.attention.forward.linear_proj | 0.332736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.566816 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.30416 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.35888 |
| megatron.core.transformer.mlp.forward.activation | 0.120928 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.307872 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.892224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.276064 |
| megatron.core.transformer.attention.forward.qkv | 171.820038 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.115648 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.082944 |
| megatron.core.transformer.attention.forward.core_attention | 832.88623 |
| megatron.core.transformer.attention.forward.linear_proj | 3.388544 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,009.967224 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,004.442322 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.855808 |
| megatron.core.transformer.mlp.forward.activation | 433.085449 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.371424 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 439.813507 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.310304 |
| megatron.core.transformer.attention.forward.qkv | 0.91984 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.071808 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.078496 |
| megatron.core.transformer.attention.forward.core_attention | 2.30944 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.83472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.065536 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.01808 |
| megatron.core.transformer.mlp.forward.activation | 0.087744 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.85808 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.976736 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.093568 |
| megatron.core.transformer.attention.forward.qkv | 173.967911 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 863.261658 |
| megatron.core.transformer.attention.forward.linear_proj | 0.744672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,038.817261 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,058.488403 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.881504 |
| megatron.core.transformer.mlp.forward.activation | 472.481842 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.470496 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 482.59024 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.23184 |
| megatron.core.transformer.attention.forward.qkv | 1.325728 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 9.87104 |
| megatron.core.transformer.attention.forward.linear_proj | 0.698432 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.919488 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.231616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.960064 |
| megatron.core.transformer.mlp.forward.activation | 0.333824 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.862208 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 6.16816 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.228928 |
| megatron.core.transformer.attention.forward.qkv | 171.95401 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 915.013794 |
| megatron.core.transformer.attention.forward.linear_proj | 2.684768 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,090.517578 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,033.266602 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 14.364896 |
| megatron.core.transformer.mlp.forward.activation | 472.968353 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 24.896223 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 513.041809 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.896896 |
| megatron.core.transformer.attention.forward.qkv | 5.119328 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 77.115486 |
| megatron.core.transformer.attention.forward.linear_proj | 2.659392 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 84.919426 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.902272 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 11.422784 |
| megatron.core.transformer.mlp.forward.activation | 1.328 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 11.5056 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 24.268385 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.902784 |
| megatron.core.transformer.attention.forward.qkv | 180.409241 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.111136 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.083072 |
| megatron.core.transformer.attention.forward.core_attention | 3,018.295898 |
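Since each probe name repeats across rows, a natural first step is to aggregate the values per probe. The sketch below shows one way to do that with pandas; the column names `function` and `time`, and the assumption that the float column holds per-call durations, are my own labels, since the preview names the columns only "0" and "1". The sample rows are taken from the table above.

```python
# A minimal aggregation sketch, assuming the two columns are a probe name
# and a per-call duration (the preview labels them only "0" and "1").
import pandas as pd

# A few sample rows copied from the preview above.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 176.714844),
    ("megatron.core.transformer.attention.forward.core_attention", 838.296509),
    ("megatron.core.transformer.attention.forward.qkv", 0.430112),
    ("megatron.core.transformer.attention.forward.core_attention", 1.235072),
]
df = pd.DataFrame(rows, columns=["function", "time"])

# Collapse repeated calls of the same probe into count / total / mean.
summary = df.groupby("function")["time"].agg(["count", "sum", "mean"])
print(summary.sort_values("sum", ascending=False))
```

With the full dataset loaded the same `groupby` would reduce the 12 distinct probe names to one summary row each, which makes the hot spots (`core_attention`, `self_attention`) easy to rank.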
End of preview.