| name | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.180256 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.44016 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.762528 |
| megatron.core.transformer.mlp.forward.activation | 0.093184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.695392 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.572 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06768 |
| megatron.core.transformer.attention.forward.qkv | 0.34224 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005248 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.869312 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.435232 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.76208 |
| megatron.core.transformer.mlp.forward.activation | 0.092672 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.696512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.5728 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068 |
| megatron.core.transformer.attention.forward.qkv | 0.34208 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00544 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
| megatron.core.transformer.attention.forward.core_attention | 10.851392 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.415744 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067712 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.763232 |
| megatron.core.transformer.mlp.forward.activation | 0.091744 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.696704 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.572576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067424 |
| megatron.core.transformer.attention.forward.qkv | 0.343136 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005248 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005664 |
| megatron.core.transformer.attention.forward.core_attention | 10.859616 |
| megatron.core.transformer.attention.forward.linear_proj | 0.18 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.425408 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068576 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.764128 |
| megatron.core.transformer.mlp.forward.activation | 0.093152 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.693312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.572192 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067264 |
| megatron.core.transformer.attention.forward.qkv | 0.341856 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005216 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.898752 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181216 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.464224 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067168 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.763904 |
| megatron.core.transformer.mlp.forward.activation | 0.093824 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694304 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.57312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068352 |
| megatron.core.transformer.attention.forward.qkv | 0.590464 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.134784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.099712 |
| megatron.core.transformer.attention.forward.core_attention | 5.843584 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181984 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 7.18368 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.752736 |
| megatron.core.transformer.mlp.forward.activation | 0.092896 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.686112 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.551776 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.029664 |
| megatron.core.transformer.attention.forward.qkv | 0.340192 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005088 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 3.787488 |
| megatron.core.transformer.attention.forward.linear_proj | 0.173728 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.34256 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.028928 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.751264 |
| megatron.core.transformer.mlp.forward.activation | 0.093568 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.682144 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.54736 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.02944 |
| megatron.core.transformer.attention.forward.qkv | 0.341888 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005056 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005088 |
| megatron.core.transformer.attention.forward.core_attention | 3.152608 |
| megatron.core.transformer.attention.forward.linear_proj | 0.173184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.708736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.029344 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.75024 |
| megatron.core.transformer.mlp.forward.activation | 0.092512 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.68272 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.545824 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.028384 |
| megatron.core.transformer.attention.forward.qkv | 0.339136 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00512 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00512 |
| megatron.core.transformer.attention.forward.core_attention | 3.214272 |
| megatron.core.transformer.attention.forward.linear_proj | 0.173472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.767744 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.029184 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.748864 |
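
The rows above are raw per-call timings for the instrumented regions of a Megatron-Core transformer layer forward pass. As a minimal sketch (not part of the original data), the table could be aggregated per region with pandas; the file name `layer_timings.csv`, the column names `name`/`values`, and the assumption that the values are milliseconds per call are all placeholders.

```python
# Sketch only: aggregate the per-call timings by instrumented region.
# "layer_timings.csv", the column names "name"/"values", and the
# milliseconds-per-call unit are assumptions, not taken from the source.
import pandas as pd

df = pd.read_csv("layer_timings.csv")

summary = (
    df.groupby("name")["values"]
    .agg(["count", "mean", "sum"])       # calls recorded, mean time, total time
    .sort_values("sum", ascending=False)  # largest total time first
)
print(summary.to_string())
```

In the rows shown, `core_attention` and the enclosing `self_attention` region dominate each layer's forward time, while the projection, bias-dropout-add, and rotary-embedding regions are comparatively cheap, so a per-region summary like this makes the hot spots immediately visible.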