| name | values |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.627552 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.082464 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.093792 |
| megatron.core.transformer.attention.forward.core_attention | 10.724224 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181376 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.985408 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.029216 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.753344 |
| megatron.core.transformer.mlp.forward.activation | 0.092416 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.688864 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.555136 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029632 |
| megatron.core.transformer.attention.forward.qkv | 0.340256 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00496 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.004896 |
| megatron.core.transformer.attention.forward.core_attention | 3.786112 |
| megatron.core.transformer.attention.forward.linear_proj | 0.174272 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.340672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028512 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.752128 |
| megatron.core.transformer.mlp.forward.activation | 0.092448 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.685248 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.549568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028544 |
| megatron.core.transformer.attention.forward.qkv | 0.339808 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.004864 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.004864 |
| megatron.core.transformer.attention.forward.core_attention | 3.15088 |
| megatron.core.transformer.attention.forward.linear_proj | 0.173248 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.703424 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.029984 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.751712 |
| megatron.core.transformer.mlp.forward.activation | 0.092928 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.683168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.547968 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028608 |
| megatron.core.transformer.attention.forward.qkv | 0.33872 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005088 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.004896 |
| megatron.core.transformer.attention.forward.core_attention | 3.216736 |
| megatron.core.transformer.attention.forward.linear_proj | 0.173792 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.769376 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.749664 |
| megatron.core.transformer.mlp.forward.activation | 0.092704 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.683392 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.545984 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028768 |
| megatron.core.transformer.attention.forward.qkv | 0.341504 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005056 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.004992 |
| megatron.core.transformer.attention.forward.core_attention | 3.247072 |
| megatron.core.transformer.attention.forward.linear_proj | 0.173664 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.803104 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.029664 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.749888 |
| megatron.core.transformer.mlp.forward.activation | 0.093152 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.684256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.54752 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028832 |
| megatron.core.transformer.attention.forward.qkv | 0.338656 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00512 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005024 |
| megatron.core.transformer.attention.forward.core_attention | 3.389472 |
| megatron.core.transformer.attention.forward.linear_proj | 0.172704 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.941216 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.751552 |
| megatron.core.transformer.mlp.forward.activation | 0.093696 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.682848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.548064 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028864 |
| megatron.core.transformer.attention.forward.qkv | 0.341344 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005024 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005024 |
| megatron.core.transformer.attention.forward.core_attention | 3.098752 |
| megatron.core.transformer.attention.forward.linear_proj | 0.172544 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.653216 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028448 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.748128 |
| megatron.core.transformer.mlp.forward.activation | 0.092224 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.684192 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.544704 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029536 |
| megatron.core.transformer.attention.forward.qkv | 0.338688 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005056 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005184 |
| megatron.core.transformer.attention.forward.core_attention | 3.108224 |
| megatron.core.transformer.attention.forward.linear_proj | 0.173056 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.660448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.0288 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.750336 |
| megatron.core.transformer.mlp.forward.activation | 0.092512 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.683872 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.546656 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028576 |
| megatron.core.transformer.attention.forward.qkv | 0.337888 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005056 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005056 |
| megatron.core.transformer.attention.forward.core_attention | 3.165696 |
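The table repeats the same 12 Megatron-LM timer names across what appear to be successive steps (the much larger `core_attention` value in the first group is consistent with a warm-up step, though the dump does not say so, and the unit is not stated). A minimal sketch of how one might aggregate such repeated timer rows, assuming simple `(name, value)` pairs; the `summarize` helper and the inlined sample rows are illustrative, not part of any Megatron API:

```python
from collections import defaultdict
from statistics import mean

# A few (timer name, value) rows copied from the table above.
# The unit is not stated in the dump, so values are treated as opaque numbers.
rows = [
    ("megatron.core.transformer.attention.forward.core_attention", 10.724224),
    ("megatron.core.transformer.attention.forward.core_attention", 3.786112),
    ("megatron.core.transformer.attention.forward.core_attention", 3.15088),
    ("megatron.core.transformer.mlp.forward.linear_fc1", 0.753344),
    ("megatron.core.transformer.mlp.forward.linear_fc1", 0.752128),
]

def summarize(rows):
    """Group repeated timer rows by name; report count, mean, min, max."""
    grouped = defaultdict(list)
    for name, value in rows:
        grouped[name].append(value)
    return {
        name: {"count": len(vals), "mean": mean(vals),
               "min": min(vals), "max": max(vals)}
        for name, vals in grouped.items()
    }

stats = summarize(rows)
for name, s in stats.items():
    # Print only the last dotted component to keep the report compact.
    print(f"{name.rsplit('.', 1)[-1]}: n={s['count']} mean={s['mean']:.6f}")
```

Comparing `min` against `mean` per timer is a quick way to spot warm-up outliers like the first `core_attention` reading.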