| name | values |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.341472 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005216 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005408 |
| megatron.core.transformer.attention.forward.core_attention | 5.340576 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.904672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068192 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761152 |
| megatron.core.transformer.mlp.forward.activation | 0.092928 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.696288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571392 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067488 |
| megatron.core.transformer.attention.forward.qkv | 0.342592 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005504 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005184 |
| megatron.core.transformer.attention.forward.core_attention | 5.313152 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180896 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.878784 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761408 |
| megatron.core.transformer.mlp.forward.activation | 0.093536 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.692064 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.069888 |
| megatron.core.transformer.attention.forward.qkv | 0.342624 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005184 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
| megatron.core.transformer.attention.forward.core_attention | 5.330944 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181408 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.897696 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06736 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.76208 |
| megatron.core.transformer.mlp.forward.activation | 0.092512 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.692672 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.567808 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06784 |
| megatron.core.transformer.attention.forward.qkv | 0.341664 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005184 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
| megatron.core.transformer.attention.forward.core_attention | 5.324672 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181056 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.888928 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06768 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.762592 |
| megatron.core.transformer.mlp.forward.activation | 0.09248 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694496 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.570432 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06768 |
| megatron.core.transformer.attention.forward.qkv | 0.34288 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005504 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005472 |
| megatron.core.transformer.attention.forward.core_attention | 5.3208 |
| megatron.core.transformer.attention.forward.linear_proj | 0.18144 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.888192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.763232 |
| megatron.core.transformer.mlp.forward.activation | 0.093504 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.696064 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.573568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067008 |
| megatron.core.transformer.attention.forward.qkv | 0.342848 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005376 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005184 |
| megatron.core.transformer.attention.forward.core_attention | 5.331488 |
| megatron.core.transformer.attention.forward.linear_proj | 0.18096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.896704 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068256 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761472 |
| megatron.core.transformer.mlp.forward.activation | 0.092832 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69296 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.567712 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068448 |
| megatron.core.transformer.attention.forward.qkv | 0.342688 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005504 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
| megatron.core.transformer.attention.forward.core_attention | 5.322656 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181152 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.888672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068096 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.76112 |
| megatron.core.transformer.mlp.forward.activation | 0.092768 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694624 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06832 |
| megatron.core.transformer.attention.forward.qkv | 0.341888 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005568 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005184 |
| megatron.core.transformer.attention.forward.core_attention | 5.314624 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.87968 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761984 |
| megatron.core.transformer.mlp.forward.activation | 0.093152 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69632 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.572704 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067616 |
| megatron.core.transformer.attention.forward.qkv | 0.341504 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005472 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005376 |
| megatron.core.transformer.attention.forward.core_attention | 5.32192 |
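The `values` column appears to record per-call forward-pass timings for each instrumented Megatron-Core module, repeating once per transformer layer; the unit is not stated in the dump. Below is a minimal sketch of how such a table could be summarized, assuming it has been exported to a CSV file named `megatron_forward_timings.csv` with columns `name` and `values` (the filename and column names are assumptions, not part of the source).

```python
# Minimal sketch: aggregate per-call timings by module name.
# Assumptions (not stated in the source): the table is available as a CSV
# named "megatron_forward_timings.csv" with columns "name" and "values";
# the unit of "values" is left unspecified.
import pandas as pd

df = pd.read_csv("megatron_forward_timings.csv")

summary = (
    df.groupby("name")["values"]
      .agg(calls="count", mean_time="mean", total_time="sum")
      .sort_values("total_time", ascending=False)
)
print(summary)
```

On the rows shown above, an aggregation like this makes the pattern easy to read: `core_attention` dominates the `self_attention` cost (about 5.3 per call versus roughly 0.34 for `qkv` and 0.18 for `linear_proj`), while the MLP block contributes about 1.57 per layer.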