Per-operation timings (`name`: string, 12 distinct values; `values`: float64; first 100 of ~55.9k rows shown):

| name | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 5.807552 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.182624 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.065984 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.118656 |
| megatron.core.transformer.mlp.forward.activation | 0.01776 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 19.337473 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 19.486176 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.065408 |
| megatron.core.transformer.attention.forward.qkv | 260.386078 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.13504 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.108736 |
| megatron.core.transformer.attention.forward.core_attention | 1,087.432129 |
| megatron.core.transformer.attention.forward.linear_proj | 49.435455 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,398.363037 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,457.490845 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.78688 |
| megatron.core.transformer.mlp.forward.activation | 561.778625 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 29.149055 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 595.332153 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.234496 |
| megatron.core.transformer.attention.forward.qkv | 0.243808 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 1.416352 |
| megatron.core.transformer.attention.forward.linear_proj | 18.400415 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 20.084288 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.23344 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.435584 |
| megatron.core.transformer.mlp.forward.activation | 3.79968 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 44.700993 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 49.010559 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.234624 |
| megatron.core.transformer.attention.forward.qkv | 215.227707 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.124192 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.09536 |
| megatron.core.transformer.attention.forward.core_attention | 1,073.21228 |
| megatron.core.transformer.attention.forward.linear_proj | 85.00531 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,378.664795 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,580.559937 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.042432 |
| megatron.core.transformer.mlp.forward.activation | 601.755554 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 70.337662 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 677.445618 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.895488 |
| megatron.core.transformer.attention.forward.qkv | 0.936224 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003328 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 9.669056 |
| megatron.core.transformer.attention.forward.linear_proj | 7.93472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 18.565248 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.894848 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.801568 |
| megatron.core.transformer.mlp.forward.activation | 0.17152 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.524256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 5.509248 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.893408 |
| megatron.core.transformer.attention.forward.qkv | 235.232742 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.115712 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.09824 |
| megatron.core.transformer.attention.forward.core_attention | 3,760.929688 |
| megatron.core.transformer.attention.forward.linear_proj | 13.7136 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4,012.000977 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 221.288345 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.700768 |
| megatron.core.transformer.mlp.forward.activation | 190.513306 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 8.09312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 203.102692 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.016992 |
| megatron.core.transformer.attention.forward.qkv | 0.024832 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 0.103392 |
| megatron.core.transformer.attention.forward.linear_proj | 10.92288 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.075232 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.015872 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.02624 |
| megatron.core.transformer.mlp.forward.activation | 0.00704 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.750016 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.79536 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.017536 |
| megatron.core.transformer.attention.forward.qkv | 241.969025 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 4,167.08252 |
| megatron.core.transformer.attention.forward.linear_proj | 7.87648 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4,421.959961 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,622.684814 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.201856 |
| megatron.core.transformer.mlp.forward.activation | 608.938293 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 67.623329 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 682.776855 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.773024 |
| megatron.core.transformer.attention.forward.qkv | 1.907488 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 39.563873 |
| megatron.core.transformer.attention.forward.linear_proj | 18.974367 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 60.469345 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1.772832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.63344 |
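A flat per-call dump like this is easier to read after aggregating by operation. A minimal stdlib sketch (assuming each row is one timed call and all times share the same unit; the sample rows below are copied from the table, not the full ~55.9k-row dataset):

```python
from collections import defaultdict

# A few (name, time) rows taken from the table above.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 260.386078),
    ("megatron.core.transformer.attention.forward.core_attention", 1087.432129),
    ("megatron.core.transformer.attention.forward.qkv", 215.227707),
    ("megatron.core.transformer.attention.forward.core_attention", 1073.21228),
]

totals = defaultdict(float)
counts = defaultdict(int)
for name, t in rows:
    op = name.rsplit(".", 1)[-1]  # keep only the leaf op name, e.g. "qkv"
    totals[op] += t
    counts[op] += 1

# Print operations sorted by total time, largest first.
for op in sorted(totals, key=totals.get, reverse=True):
    print(f"{op}: total={totals[op]:.3f} mean={totals[op] / counts[op]:.3f} n={counts[op]}")
```

Grouping on the leaf component collapses the long dotted paths into the 12 distinct operation names the column metadata reports, which makes totals and means per operation easy to compare.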