| 0 (string, 12 classes) | 1 (float64, 0–4.34k) |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.18064 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.439552 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067264 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.763392 |
| megatron.core.transformer.mlp.forward.activation | 0.0936 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.696288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.573952 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067872 |
| megatron.core.transformer.attention.forward.qkv | 0.341696 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005504 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.881952 |
| megatron.core.transformer.attention.forward.linear_proj | 0.181088 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.44736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067456 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.762944 |
| megatron.core.transformer.mlp.forward.activation | 0.092672 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69488 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571712 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067904 |
| megatron.core.transformer.attention.forward.qkv | 0.346336 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00544 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 43.372097 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179808 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 43.94128 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067264 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.762592 |
| megatron.core.transformer.mlp.forward.activation | 0.093696 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.69504 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.572704 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067904 |
| megatron.core.transformer.attention.forward.qkv | 0.343328 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005376 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 37.985825 |
| megatron.core.transformer.attention.forward.linear_proj | 0.182112 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 38.553886 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067936 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.765792 |
| megatron.core.transformer.mlp.forward.activation | 0.093536 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694208 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.574688 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067744 |
| megatron.core.transformer.attention.forward.qkv | 0.670592 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.08176 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.08832 |
| megatron.core.transformer.attention.forward.core_attention | 65.575203 |
| megatron.core.transformer.attention.forward.linear_proj | 0.183264 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 66.925247 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.756832 |
| megatron.core.transformer.mlp.forward.activation | 0.094336 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.691616 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.564096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06816 |
| megatron.core.transformer.attention.forward.qkv | 0.342528 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005216 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005632 |
| megatron.core.transformer.attention.forward.core_attention | 10.853536 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180256 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.418624 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067776 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.760384 |
| megatron.core.transformer.mlp.forward.activation | 0.092512 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.695584 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569344 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067488 |
| megatron.core.transformer.attention.forward.qkv | 0.342464 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005632 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
| megatron.core.transformer.attention.forward.core_attention | 10.870048 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180928 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.436064 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067104 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.760544 |
| megatron.core.transformer.mlp.forward.activation | 0.092384 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.695328 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569152 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068096 |
| megatron.core.transformer.attention.forward.qkv | 0.343968 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
| megatron.core.transformer.attention.forward.core_attention | 10.858496 |
| megatron.core.transformer.attention.forward.linear_proj | 0.180736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.425792 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067264 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.76336 |
| megatron.core.transformer.mlp.forward.activation | 0.093184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.694944 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.572416 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068448 |
| megatron.core.transformer.attention.forward.qkv | 0.341696 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.005216 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
| megatron.core.transformer.attention.forward.core_attention | 10.853696 |
| megatron.core.transformer.attention.forward.linear_proj | 0.179392 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.417856 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067424 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.76272 |
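The table above pairs a Megatron-Core timer name (column 0) with a recorded float value (column 1); the unit is not stated in the source. Below is a minimal sketch of how such a two-column dump could be aggregated per timer. The file name `timers.csv` and the column labels `name` and `time` are illustrative assumptions, not part of the original dataset.

```python
import pandas as pd

# Assumed layout: headerless CSV with one timer name and one float per row
# (mirroring the two columns shown above); "timers.csv" is a hypothetical path.
df = pd.read_csv("timers.csv", header=None, names=["name", "time"])

# Collapse repeated measurements of the same timer into summary statistics,
# sorted by total time so the dominant calls (e.g. core_attention) surface first.
summary = (
    df.groupby("name")["time"]
    .agg(["count", "mean", "max", "sum"])
    .sort_values("sum", ascending=False)
)
print(summary)
```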