name | values |
---|---|
megatron.core.transformer.attention.forward.qkv | 0.342368 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
megatron.core.transformer.attention.forward.core_attention | 10.870784 |
megatron.core.transformer.attention.forward.linear_proj | 0.18096 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.436576 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068448 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.761536 |
megatron.core.transformer.mlp.forward.activation | 0.092896 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.695968 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571296 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067584 |
megatron.core.transformer.attention.forward.qkv | 0.341632 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
megatron.core.transformer.attention.forward.core_attention | 10.85504 |
megatron.core.transformer.attention.forward.linear_proj | 0.180192 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.41968 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06752 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.763488 |
megatron.core.transformer.mlp.forward.activation | 0.092448 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.693536 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.570848 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068352 |
megatron.core.transformer.attention.forward.qkv | 0.344224 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005312 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005248 |
megatron.core.transformer.attention.forward.core_attention | 10.883936 |
megatron.core.transformer.attention.forward.linear_proj | 0.179616 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.449952 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067808 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.761856 |
megatron.core.transformer.mlp.forward.activation | 0.092 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.694912 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569504 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067936 |
megatron.core.transformer.attention.forward.qkv | 0.341984 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005248 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005792 |
megatron.core.transformer.attention.forward.core_attention | 10.869056 |
megatron.core.transformer.attention.forward.linear_proj | 0.18176 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.435616 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068064 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.762272 |
megatron.core.transformer.mlp.forward.activation | 0.092128 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.694336 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.570016 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068352 |
megatron.core.transformer.attention.forward.qkv | 0.72384 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.080544 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.089408 |
megatron.core.transformer.attention.forward.core_attention | 13.73504 |
megatron.core.transformer.attention.forward.linear_proj | 0.181632 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 15.165056 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067904 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.753792 |
megatron.core.transformer.mlp.forward.activation | 0.093312 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.691904 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.560096 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067456 |
megatron.core.transformer.attention.forward.qkv | 0.34096 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005344 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005344 |
megatron.core.transformer.attention.forward.core_attention | 10.886688 |
megatron.core.transformer.attention.forward.linear_proj | 0.180704 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.450784 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06752 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.760608 |
megatron.core.transformer.mlp.forward.activation | 0.09216 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.69328 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.567552 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068416 |
megatron.core.transformer.attention.forward.qkv | 0.344032 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005216 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
megatron.core.transformer.attention.forward.core_attention | 10.848 |
megatron.core.transformer.attention.forward.linear_proj | 0.180448 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.41472 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067488 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.762528 |
megatron.core.transformer.mlp.forward.activation | 0.093472 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.691296 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569024 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06736 |
megatron.core.transformer.attention.forward.qkv | 0.342304 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005248 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005376 |
megatron.core.transformer.attention.forward.core_attention | 10.855488 |
megatron.core.transformer.attention.forward.linear_proj | 0.17968 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.41952 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067552 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.763776 |
megatron.core.transformer.mlp.forward.activation | 0.092352 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.69456 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571648 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067744 |
megatron.core.transformer.attention.forward.qkv | 0.34192 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005184 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
megatron.core.transformer.attention.forward.core_attention | 10.84848 |