Megatron-Core per-op forward timings (dataset-viewer excerpt; full table: 4.34k rows; first column: string, 12 distinct timer names; `values` column: float64).

| name | values |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.341664 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005344 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005184 |
| megatron.core.transformer.attention.forward.core_attention | 10.865888 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179808 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.429408 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067808 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.762304 |
| megatron.core.transformer.mlp.forward.activation | 0.091936 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.693696 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.568992 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067904 |
| megatron.core.transformer.attention.forward.qkv | 0.341408 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.875456 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180032 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.439616 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067808 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.760256 |
| megatron.core.transformer.mlp.forward.activation | 0.093472 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.693152 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.56816 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067968 |
| megatron.core.transformer.attention.forward.qkv | 0.340512 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005344 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
| megatron.core.transformer.attention.forward.core_attention | 10.869024 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179744 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.43184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.762176 |
| megatron.core.transformer.mlp.forward.activation | 0.092736 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69392 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569824 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068288 |
| megatron.core.transformer.attention.forward.qkv | 0.342336 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005248 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.877504 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179776 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.441824 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761664 |
| megatron.core.transformer.mlp.forward.activation | 0.093472 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.693792 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.570144 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067648 |
| megatron.core.transformer.attention.forward.qkv | 0.342176 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
| megatron.core.transformer.attention.forward.core_attention | 10.87168 |
| megatron.core.transformer.attention.forward.linear_proj | 0.18016 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.436064 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068512 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.76176 |
| megatron.core.transformer.mlp.forward.activation | 0.092192 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694624 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569984 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068064 |
| megatron.core.transformer.attention.forward.qkv | 0.343616 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
| megatron.core.transformer.attention.forward.core_attention | 10.884 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180224 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.450144 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06736 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761696 |
| megatron.core.transformer.mlp.forward.activation | 0.094176 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.695776 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.573184 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068768 |
| megatron.core.transformer.attention.forward.qkv | 0.342368 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
| megatron.core.transformer.attention.forward.core_attention | 10.870784 |
| megatron.core.transformer.attention.forward.linear_proj | 0.18096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.436576 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068448 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761536 |
| megatron.core.transformer.mlp.forward.activation | 0.092896 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.695968 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571296 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067584 |
| megatron.core.transformer.attention.forward.qkv | 0.341632 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
| megatron.core.transformer.attention.forward.core_attention | 10.85504 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.41968 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.763488 |
| megatron.core.transformer.mlp.forward.activation | 0.092448 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.693536 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.570848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068352 |
| megatron.core.transformer.attention.forward.qkv | 0.344224 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005312 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005248 |
| megatron.core.transformer.attention.forward.core_attention | 10.883936 |
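The 12 timer names above repeat once per transformer layer, so the table can be sliced into per-layer blocks and analyzed directly. A minimal sketch, using the first layer's block from the table (units assumed to be milliseconds, which the excerpt does not state), showing that `core_attention` accounts for nearly all of the `self_attention` time:

```python
# Per-op forward timings for one layer, copied from the first block of the
# table above (values assumed to be milliseconds).
rows = [
    ("attention.forward.qkv", 0.341664),
    ("attention.forward.adjust_key_value", 0.005344),
    ("attention.forward.rotary_pos_emb", 0.005184),
    ("attention.forward.core_attention", 10.865888),
    ("attention.forward.linear_proj", 0.179808),
    ("transformer_layer._forward_attention.self_attention", 11.429408),
    ("transformer_layer._forward_attention.self_attn_bda", 0.067808),
    ("mlp.forward.linear_fc1", 0.762304),
    ("mlp.forward.activation", 0.091936),
    ("mlp.forward.linear_fc2", 0.693696),
    ("transformer_layer._forward_mlp.mlp", 1.568992),
    ("transformer_layer._forward_mlp.mlp_bda", 0.067904),
]
t = dict(rows)

# Fraction of the self_attention wrapper timer spent inside core_attention.
attn_total = t["transformer_layer._forward_attention.self_attention"]
core_share = t["attention.forward.core_attention"] / attn_total
print(f"core_attention share of self_attention: {core_share:.1%}")  # ~95%

# Attention vs. MLP: the wrapper timers show attention dominating the layer.
mlp_total = t["transformer_layer._forward_mlp.mlp"]
print(f"self_attention / mlp ratio: {attn_total / mlp_total:.1f}x")
```

The same split could be applied to every 12-row block to compare layers; the blocks above differ only at the third decimal place, so per-layer variance is small.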