Dataset viewer header: column `0` is a string with 12 distinct module paths, column `values` is float64, and the full split holds roughly 55.9k rows. The excerpt below is reconstructed as a two-column table (units of the float column are not specified in the extract).

| module (`0`) | values |
| --- | --- |
| megatron.core.transformer.attention.forward.linear_proj | 7.94192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,971.435059 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121312 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.767936 |
| megatron.core.transformer.mlp.forward.activation | 0.088288 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.955136 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.823264 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.1208 |
| megatron.core.transformer.attention.forward.qkv | 188.774918 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.104288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.10896 |
| megatron.core.transformer.attention.forward.core_attention | 911.944336 |
| megatron.core.transformer.attention.forward.linear_proj | 10.957888 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,113.166748 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 268.272919 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.070272 |
| megatron.core.transformer.mlp.forward.activation | 189.18605 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.628768 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 192.318527 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.408704 |
| megatron.core.transformer.attention.forward.qkv | 0.533888 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.082528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.095744 |
| megatron.core.transformer.attention.forward.core_attention | 0.910912 |
| megatron.core.transformer.attention.forward.linear_proj | 0.499328 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.624576 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.385376 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.329536 |
| megatron.core.transformer.mlp.forward.activation | 0.20304 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.664512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.417792 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.345824 |
| megatron.core.transformer.attention.forward.qkv | 192.18573 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003232 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 1,352.739624 |
| megatron.core.transformer.attention.forward.linear_proj | 11.817312 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,557.538574 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,297.921143 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 12.289984 |
| megatron.core.transformer.mlp.forward.activation | 502.551971 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 101.347458 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 616.203003 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 3.58032 |
| megatron.core.transformer.attention.forward.qkv | 6.20896 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 147.391617 |
| megatron.core.transformer.attention.forward.linear_proj | 11.971424 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 165.596222 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 3.561632 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 12.775616 |
| megatron.core.transformer.mlp.forward.activation | 1.323872 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 20.66432 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 34.776066 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 3.89104 |
| megatron.core.transformer.attention.forward.qkv | 194.427002 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.110592 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.087936 |
| megatron.core.transformer.attention.forward.core_attention | 888.976685 |
| megatron.core.transformer.attention.forward.linear_proj | 8.737216 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,093.337158 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,171.893555 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.053632 |
| megatron.core.transformer.mlp.forward.activation | 462.504028 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.907744 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 465.375336 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.511776 |
| megatron.core.transformer.attention.forward.qkv | 0.532256 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.077344 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.084832 |
| megatron.core.transformer.attention.forward.core_attention | 1.035008 |
| megatron.core.transformer.attention.forward.linear_proj | 0.710496 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.777376 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.57088 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.600064 |
| megatron.core.transformer.mlp.forward.activation | 0.43568 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.652832 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.79648 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.512416 |
| megatron.core.transformer.attention.forward.qkv | 216.019043 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.128704 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.08288 |
| megatron.core.transformer.attention.forward.core_attention | 965.195313 |
| megatron.core.transformer.attention.forward.linear_proj | 4.242624 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,186.639404 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,248.421387 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.51888 |
| megatron.core.transformer.mlp.forward.activation | 483.644073 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.033568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 487.912445 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.234816 |
| megatron.core.transformer.attention.forward.qkv | 0.396096 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.028448 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.10608 |
| megatron.core.transformer.attention.forward.core_attention | 3.43968 |
| megatron.core.transformer.attention.forward.linear_proj | 0.810816 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.947232 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.235232 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.824032 |
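Since each module path repeats across the excerpt with different measurements, a per-module summary is often more readable than the raw rows. Below is a minimal sketch of such an aggregation in pandas; the column names `module` and `values` and the inline sample rows are illustrative (taken from the table above), the units of the float column are not stated in the extract, and in practice the full dataset would be loaded from its actual file rather than an inline string.

```python
import io

import pandas as pd

# Illustrative subset of the rows shown above; assumed column names
# "module" and "values" are not confirmed by the original extract.
csv_text = """module,values
megatron.core.transformer.attention.forward.qkv,188.774918
megatron.core.transformer.attention.forward.core_attention,911.944336
megatron.core.transformer.attention.forward.core_attention,0.910912
megatron.core.transformer.attention.forward.linear_proj,7.94192
megatron.core.transformer.mlp.forward.linear_fc1,0.767936
"""

df = pd.read_csv(io.StringIO(csv_text))

# Collapse repeated measurements per module into count, mean, and max,
# sorted so the most expensive modules appear first.
summary = (
    df.groupby("module")["values"]
      .agg(["count", "mean", "max"])
      .sort_values("mean", ascending=False)
)
print(summary)
```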