| name | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.181696 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 18.682465 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.752576 |
| megatron.core.transformer.mlp.forward.activation | 0.091968 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.692512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.558528 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067808 |
| megatron.core.transformer.attention.forward.qkv | 0.34144 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005312 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.866528 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179168 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.429152 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06784 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.759104 |
| megatron.core.transformer.mlp.forward.activation | 0.09344 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69504 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.568544 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067648 |
| megatron.core.transformer.attention.forward.qkv | 0.340704 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005312 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005248 |
| megatron.core.transformer.attention.forward.core_attention | 10.86336 |
| megatron.core.transformer.attention.forward.linear_proj | 0.178496 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.4248 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068096 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761536 |
| megatron.core.transformer.mlp.forward.activation | 0.092544 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.691552 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.567328 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067456 |
| megatron.core.transformer.attention.forward.qkv | 0.340768 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005312 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
| megatron.core.transformer.attention.forward.core_attention | 10.85872 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.423328 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068288 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761504 |
| megatron.core.transformer.mlp.forward.activation | 0.092704 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694976 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.57072 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067872 |
| megatron.core.transformer.attention.forward.qkv | 0.343744 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.866848 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180608 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.433472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068768 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.761824 |
| megatron.core.transformer.mlp.forward.activation | 0.09408 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694144 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571328 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067744 |
| megatron.core.transformer.attention.forward.qkv | 0.341632 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005216 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.8768 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180416 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.441536 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.762112 |
| megatron.core.transformer.mlp.forward.activation | 0.09264 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694016 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569728 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068064 |
| megatron.core.transformer.attention.forward.qkv | 0.34256 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005472 |
| megatron.core.transformer.attention.forward.core_attention | 10.867872 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180544 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.433152 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067264 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.763296 |
| megatron.core.transformer.mlp.forward.activation | 0.093568 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.695456 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.573504 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067648 |
| megatron.core.transformer.attention.forward.qkv | 0.343648 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005344 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005216 |
| megatron.core.transformer.attention.forward.core_attention | 10.891488 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.457408 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.068064 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.76352 |
| megatron.core.transformer.mlp.forward.activation | 0.092352 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.696512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.573856 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067712 |
| megatron.core.transformer.attention.forward.qkv | 0.343744 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.857248 |
| megatron.core.transformer.attention.forward.linear_proj | 0.18176 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.425184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067648 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.764288 |
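
The same 12 Megatron-Core operation names repeat once per transformer layer, so the records are easiest to read after aggregating by operation. Below is a minimal sketch of such an aggregation; it assumes the table has been exported to a CSV file (the filename `layer_timings.csv`, the column headers `name` and `values` taken from the reconstructed header above, and the reading of `values` as per-call forward times are all assumptions, not something stated by the source).

```python
# Minimal aggregation sketch. Assumptions: the table above was exported to a CSV
# file named "layer_timings.csv" (hypothetical name) with the reconstructed
# column headers "name" and "values".
import pandas as pd

df = pd.read_csv("layer_timings.csv")

# The 12 operation names repeat once per transformer layer; group them to get
# per-operation statistics across layers.
summary = (
    df.groupby("name")["values"]
      .agg(["count", "mean", "sum"])
      .sort_values("sum", ascending=False)
)
print(summary.to_string())
```

Sorting by the summed column makes the pattern in the rows above easy to confirm: `core_attention` (about 10.9 per call) accounts for most of each layer's `self_attention` time (about 11.4), while the projections, MLP blocks, and rotary embedding are comparatively cheap.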