| 0 (string, 12 classes) | values (float64) |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.173568 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.717472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028352 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.751296 |
| megatron.core.transformer.mlp.forward.activation | 0.093504 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.6848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.549568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029024 |
| megatron.core.transformer.attention.forward.qkv | 0.34032 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.004992 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005024 |
| megatron.core.transformer.attention.forward.core_attention | 3.155104 |
| megatron.core.transformer.attention.forward.linear_proj | 0.17344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.709568 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.748704 |
| megatron.core.transformer.mlp.forward.activation | 0.093056 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.681472 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.543584 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029184 |
| megatron.core.transformer.attention.forward.qkv | 0.337376 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005024 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005376 |
| megatron.core.transformer.attention.forward.core_attention | 3.255712 |
| megatron.core.transformer.attention.forward.linear_proj | 0.174048 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.807744 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028992 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.748096 |
| megatron.core.transformer.mlp.forward.activation | 0.09312 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.682848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.544256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029056 |
| megatron.core.transformer.attention.forward.qkv | 0.341216 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005408 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005088 |
| megatron.core.transformer.attention.forward.core_attention | 3.663072 |
| megatron.core.transformer.attention.forward.linear_proj | 0.171872 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.21696 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028896 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.749056 |
| megatron.core.transformer.mlp.forward.activation | 0.092064 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.683232 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.544736 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.02976 |
| megatron.core.transformer.attention.forward.qkv | 0.605824 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.073248 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.088768 |
| megatron.core.transformer.attention.forward.core_attention | 3.144 |
| megatron.core.transformer.attention.forward.linear_proj | 0.354944 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.672896 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.315456 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.97072 |
| megatron.core.transformer.mlp.forward.activation | 0.092832 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69664 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.820864 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029408 |
| megatron.core.transformer.attention.forward.qkv | 0.340384 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.004896 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.004864 |
| megatron.core.transformer.attention.forward.core_attention | 2.515776 |
| megatron.core.transformer.attention.forward.linear_proj | 0.325568 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.358432 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.22192 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.901536 |
| megatron.core.transformer.mlp.forward.activation | 0.093344 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69408 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.75168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028224 |
| megatron.core.transformer.attention.forward.qkv | 0.340512 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.004864 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.004928 |
| megatron.core.transformer.attention.forward.core_attention | 2.351552 |
| megatron.core.transformer.attention.forward.linear_proj | 0.326656 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.198304 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.22528 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.900992 |
| megatron.core.transformer.mlp.forward.activation | 0.092736 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.691424 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.742208 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028416 |
| megatron.core.transformer.attention.forward.qkv | 0.34112 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.004896 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.004896 |
| megatron.core.transformer.attention.forward.core_attention | 2.422624 |
| megatron.core.transformer.attention.forward.linear_proj | 0.327648 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.268448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.263616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.905248 |
| megatron.core.transformer.mlp.forward.activation | 0.094048 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.695808 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.754912 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029408 |
| megatron.core.transformer.attention.forward.qkv | 0.338976 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.004896 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00496 |
| megatron.core.transformer.attention.forward.core_attention | 2.547904 |
| megatron.core.transformer.attention.forward.linear_proj | 0.332096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.416544 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.226752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.90208 |
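The rows above are repeated measurements of the same 12 forward-pass scopes (one set per transformer layer), so the table is easiest to read after grouping by scope name. Below is a minimal sketch of such an aggregation with pandas; the CSV file name, the column names `scope` and `value`, and the time unit of the values are illustrative assumptions, not something the source states.

```python
import pandas as pd

# Load the exported (scope, value) pairs. The file name and column names
# are assumptions for illustration; the source does not specify them.
df = pd.read_csv(
    "megatron_forward_timings.csv",
    names=["scope", "value"],
    header=None,
)

# Aggregate the repeated per-layer measurements by scope: how many times
# each scope was recorded, and its mean and total recorded time.
summary = (
    df.groupby("scope")["value"]
    .agg(count="count", mean="mean", total="sum")
    .sort_values("total", ascending=False)
)

print(summary)
```

Sorting by the summed time makes the dominant scopes (here, `core_attention` and the `self_attention` / `mlp` wrappers) stand out immediately, while the tiny scopes such as `rotary_pos_emb` and `adjust_key_value` sink to the bottom.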