| 0 (string, 12 unique values) | 1 (float64, 0 to 55.9k) |
| --- | --- |
| megatron.core.transformer.attention.forward.linear_proj | 0.029024 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3,009.008789 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.016768 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.103296 |
| megatron.core.transformer.mlp.forward.activation | 0.017504 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.0968 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.230496 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.016608 |
| megatron.core.transformer.attention.forward.qkv | 264.864746 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 10,987.004883 |
| megatron.core.transformer.attention.forward.linear_proj | 8.078208 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11,260.722656 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,606.737793 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 29.634111 |
| megatron.core.transformer.mlp.forward.activation | 696.639282 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 23.755072 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 750.042969 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 2.070592 |
| megatron.core.transformer.attention.forward.qkv | 10.462752 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 3,570.848633 |
| megatron.core.transformer.attention.forward.linear_proj | 5.34688 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3,586.682129 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1.769664 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 23.364191 |
| megatron.core.transformer.mlp.forward.activation | 3.199328 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 24.305183 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 50.880287 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 2.332672 |
| megatron.core.transformer.attention.forward.qkv | 225.697693 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.117632 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.108576 |
| megatron.core.transformer.attention.forward.core_attention | 4,543.787109 |
| megatron.core.transformer.attention.forward.linear_proj | 3.56128 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4,775.291504 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,550.083008 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.243168 |
| megatron.core.transformer.mlp.forward.activation | 585.292908 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.0616 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 588.755432 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.460928 |
| megatron.core.transformer.attention.forward.qkv | 0.605152 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.081984 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.094304 |
| megatron.core.transformer.attention.forward.core_attention | 2,675.952637 |
| megatron.core.transformer.attention.forward.linear_proj | 0.09344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,677.159912 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.038272 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.376256 |
| megatron.core.transformer.mlp.forward.activation | 0.04752 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.349248 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.784896 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.038432 |
| megatron.core.transformer.attention.forward.qkv | 236.058273 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.112672 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.092128 |
| megatron.core.transformer.attention.forward.core_attention | 9,198.258789 |
| megatron.core.transformer.attention.forward.linear_proj | 3.200064 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 9,442.643555 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,647.524414 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 4.87376 |
| megatron.core.transformer.mlp.forward.activation | 851.060974 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.967488 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 858.7005 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.50752 |
| megatron.core.transformer.attention.forward.qkv | 1.706112 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.086784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.096704 |
| megatron.core.transformer.attention.forward.core_attention | 1,675.750732 |
| megatron.core.transformer.attention.forward.linear_proj | 0.343232 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,678.327637 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.120672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.522944 |
| megatron.core.transformer.mlp.forward.activation | 0.168672 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.392736 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.096096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.119648 |
| megatron.core.transformer.attention.forward.qkv | 186.042435 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.11392 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.079424 |
| megatron.core.transformer.attention.forward.core_attention | 844.653931 |
| megatron.core.transformer.attention.forward.linear_proj | 19.017792 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,050.946899 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 225.312607 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.894912 |
| megatron.core.transformer.mlp.forward.activation | 158.662567 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.074016 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 161.767365 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.391264 |
| megatron.core.transformer.attention.forward.qkv | 0.510752 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.069664 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.079456 |
| megatron.core.transformer.attention.forward.core_attention | 1.188896 |
| megatron.core.transformer.attention.forward.linear_proj | 0.48592 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.754208 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.317824 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.29056 |
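
Below is a minimal sketch of how a dump like this could be aggregated once exported. It assumes the two columns are saved to a headerless CSV (`profile.csv` is a hypothetical filename, not part of the dataset) with the Megatron module path in the first column and the measured value in the second; the units of the second column are not stated in the dump, so none are assumed.

```python
# Minimal sketch, assuming the table above was exported as a headerless
# two-column CSV ("profile.csv" is a hypothetical filename): column 0 is the
# Megatron module path, column 1 is the measured value (units unspecified).
import pandas as pd

df = pd.read_csv(
    "profile.csv",
    header=None,
    names=["op", "value"],
    thousands=",",  # values such as 10,987.004883 use thousands separators
)

# Each module path repeats across the dump, so aggregate per operation to see
# where the measured values concentrate (e.g. core_attention vs. the MLP steps).
summary = (
    df.groupby("op")["value"]
    .agg(["count", "sum", "mean", "max"])
    .sort_values("sum", ascending=False)
)
print(summary)
```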