Timer | Value
---|---
megatron.core.transformer.attention.forward.qkv | 0.34256
megatron.core.transformer.attention.forward.adjust_key_value | 0.005344
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005568
megatron.core.transformer.attention.forward.core_attention | 10.888832
megatron.core.transformer.attention.forward.linear_proj | 0.18016
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.455136
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068128
megatron.core.transformer.mlp.forward.linear_fc1 | 0.761824
megatron.core.transformer.mlp.forward.activation | 0.094464
megatron.core.transformer.mlp.forward.linear_fc2 | 0.694944
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.57216
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067712
megatron.core.transformer.attention.forward.qkv | 0.341728
megatron.core.transformer.attention.forward.adjust_key_value | 0.005536
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528
megatron.core.transformer.attention.forward.core_attention | 10.861344
megatron.core.transformer.attention.forward.linear_proj | 0.179808
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.425664
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068512
megatron.core.transformer.mlp.forward.linear_fc1 | 0.762016
megatron.core.transformer.mlp.forward.activation | 0.093792
megatron.core.transformer.mlp.forward.linear_fc2 | 0.694848
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.572096
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068192
megatron.core.transformer.attention.forward.qkv | 0.342048
megatron.core.transformer.attention.forward.adjust_key_value | 0.005216
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005472
megatron.core.transformer.attention.forward.core_attention | 10.868928
megatron.core.transformer.attention.forward.linear_proj | 0.181408
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.435072
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067936
megatron.core.transformer.mlp.forward.linear_fc1 | 0.762592
megatron.core.transformer.mlp.forward.activation | 0.09216
megatron.core.transformer.mlp.forward.linear_fc2 | 0.694336
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.57008
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068128
megatron.core.transformer.attention.forward.qkv | 0.344288
megatron.core.transformer.attention.forward.adjust_key_value | 0.005312
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528
megatron.core.transformer.attention.forward.core_attention | 10.875232
megatron.core.transformer.attention.forward.linear_proj | 0.180448
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.442656
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068192
megatron.core.transformer.mlp.forward.linear_fc1 | 0.7632
megatron.core.transformer.mlp.forward.activation | 0.092864
megatron.core.transformer.mlp.forward.linear_fc2 | 0.697248
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.574464
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068224
megatron.core.transformer.attention.forward.qkv | 0.342816
megatron.core.transformer.attention.forward.adjust_key_value | 0.00528
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005504
megatron.core.transformer.attention.forward.core_attention | 10.862208
megatron.core.transformer.attention.forward.linear_proj | 0.180832
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.42848
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06832
megatron.core.transformer.mlp.forward.linear_fc1 | 0.762944
megatron.core.transformer.mlp.forward.activation | 0.092224
megatron.core.transformer.mlp.forward.linear_fc2 | 0.695264
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571776
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067872
megatron.core.transformer.attention.forward.qkv | 0.345312
megatron.core.transformer.attention.forward.adjust_key_value | 0.00544
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005376
megatron.core.transformer.attention.forward.core_attention | 10.86128
megatron.core.transformer.attention.forward.linear_proj | 0.18048
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.43008
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068352
megatron.core.transformer.mlp.forward.linear_fc1 | 0.763744
megatron.core.transformer.mlp.forward.activation | 0.093984
megatron.core.transformer.mlp.forward.linear_fc2 | 0.694208
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.573024
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068448
megatron.core.transformer.attention.forward.qkv | 0.342624
megatron.core.transformer.attention.forward.adjust_key_value | 0.00528
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312
megatron.core.transformer.attention.forward.core_attention | 10.86416
megatron.core.transformer.attention.forward.linear_proj | 0.179904
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.429152
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067712
megatron.core.transformer.mlp.forward.linear_fc1 | 0.763712
megatron.core.transformer.mlp.forward.activation | 0.092992
megatron.core.transformer.mlp.forward.linear_fc2 | 0.694464
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571936
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068448
megatron.core.transformer.attention.forward.qkv | 0.343136
megatron.core.transformer.attention.forward.adjust_key_value | 0.005568
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005344
megatron.core.transformer.attention.forward.core_attention | 10.868608
megatron.core.transformer.attention.forward.linear_proj | 0.180544
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.434368
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067232
megatron.core.transformer.mlp.forward.linear_fc1 | 0.764512
megatron.core.transformer.mlp.forward.activation | 0.093888
megatron.core.transformer.mlp.forward.linear_fc2 | 0.696256
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.575968
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067616
megatron.core.transformer.attention.forward.qkv | 0.68704
megatron.core.transformer.attention.forward.adjust_key_value | 0.080928
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.102272
megatron.core.transformer.attention.forward.core_attention | 17.294336
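The data repeats the same twelve Megatron-Core timers across several measurement blocks. As a quick sanity check, the sketch below sums the component timers from the first block and compares them with the reported aggregate timers (`self_attention`, `mlp`). The values are copied directly from the table; the small gap between the sum and the aggregate is presumably overhead not covered by the sub-timers, and the units are whatever the original profiler reported.

```python
# Component timer values from the first measurement block of the table above.
attention_parts = {
    "qkv": 0.34256,
    "adjust_key_value": 0.005344,
    "rotary_pos_emb": 0.005568,
    "core_attention": 10.888832,
    "linear_proj": 0.18016,
}
mlp_parts = {
    "linear_fc1": 0.761824,
    "activation": 0.094464,
    "linear_fc2": 0.694944,
}

# Aggregate timers reported in the same block.
self_attention_total = 11.455136
mlp_total = 1.57216

attn_sum = sum(attention_parts.values())
mlp_sum = sum(mlp_parts.values())

# The sub-timer sums fall slightly below the aggregates, as expected
# when the aggregate scope includes a little untimed glue code.
print(f"attention parts sum: {attn_sum:.6f} vs reported {self_attention_total}")
print(f"mlp parts sum:       {mlp_sum:.6f} vs reported {mlp_total}")
```

Within each full block, `core_attention` dominates the layer time by well over an order of magnitude, so any optimization effort on this workload would start there.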