| 0 (string, 12 values) | 1 (float64, 0–55.9k) |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 4.323296 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3,204.367432 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 190.233276 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.97536 |
| megatron.core.transformer.mlp.forward.activation | 153.171906 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.95008 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 157.858368 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.400096 |
| megatron.core.transformer.attention.forward.qkv | 0.494304 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.071904 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.080288 |
| megatron.core.transformer.attention.forward.core_attention | 2.02 |
| megatron.core.transformer.attention.forward.linear_proj | 0.351488 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3.44816 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.299264 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.272768 |
| megatron.core.transformer.mlp.forward.activation | 0.168128 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.203456 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.81632 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.263008 |
| megatron.core.transformer.attention.forward.qkv | 178.728409 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 5,016.333984 |
| megatron.core.transformer.attention.forward.linear_proj | 8.55792 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,204.814941 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,132.86499 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 25.623039 |
| megatron.core.transformer.mlp.forward.activation | 491.546814 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 23.271616 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 540.455078 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.771232 |
| megatron.core.transformer.attention.forward.qkv | 10.45024 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 324.864105 |
| megatron.core.transformer.attention.forward.linear_proj | 5.424416 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 340.762238 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1.785696 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 23.509727 |
| megatron.core.transformer.mlp.forward.activation | 3.06416 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 23.696672 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 50.282143 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 2.331904 |
| megatron.core.transformer.attention.forward.qkv | 170.767227 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.11776 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.091424 |
| megatron.core.transformer.attention.forward.core_attention | 3,078.283691 |
| megatron.core.transformer.attention.forward.linear_proj | 2.950528 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3,253.841797 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,018.423462 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.1888 |
| megatron.core.transformer.mlp.forward.activation | 429.187683 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.050208 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 431.898865 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.431136 |
| megatron.core.transformer.attention.forward.qkv | 0.80928 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.072512 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.083616 |
| megatron.core.transformer.attention.forward.core_attention | 3.255616 |
| megatron.core.transformer.attention.forward.linear_proj | 0.273696 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.8344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.3232 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.776 |
| megatron.core.transformer.mlp.forward.activation | 0.257024 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.644512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.78576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.446176 |
| megatron.core.transformer.attention.forward.qkv | 179.273087 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.112192 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.085664 |
| megatron.core.transformer.attention.forward.core_attention | 3,174.500488 |
| megatron.core.transformer.attention.forward.linear_proj | 5.465248 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3,360.770264 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,032.895386 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.166848 |
| megatron.core.transformer.mlp.forward.activation | 449.477539 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.049984 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 457.317291 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.120672 |
| megatron.core.transformer.attention.forward.qkv | 0.7336 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.072608 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.08208 |
| megatron.core.transformer.attention.forward.core_attention | 7.001184 |
| megatron.core.transformer.attention.forward.linear_proj | 0.344896 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8.444064 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.120416 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.522112 |
| megatron.core.transformer.mlp.forward.activation | 0.170592 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.386784 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.09184 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.12144 |
| megatron.core.transformer.attention.forward.qkv | 170.137924 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 4,039.765381 |
| megatron.core.transformer.attention.forward.linear_proj | 4.762688 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4,217.144043 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,143.829224 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 9.140128 |
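The rows above are a flattened log: the same 12 timer names repeat in groups, one group per forward pass. A minimal sketch for reshaping such pairs into one row per pass is shown below; it assumes each pass logs every timer name exactly once, and the inline sample is only a small excerpt of the table, not the full dataset.

```python
import pandas as pd

# Small excerpt of the (timer name, value) pairs from the table above;
# in practice the full list would be loaded from the dataset itself.
pairs = [
    ("megatron.core.transformer.attention.forward.qkv", 178.728409),
    ("megatron.core.transformer.attention.forward.core_attention", 5016.333984),
    ("megatron.core.transformer.attention.forward.qkv", 10.45024),
    ("megatron.core.transformer.attention.forward.core_attention", 324.864105),
]

df = pd.DataFrame(pairs, columns=["timer", "value"])

# Assuming each forward pass logs each timer exactly once, the n-th
# occurrence of a given timer name belongs to the n-th pass.
df["pass_idx"] = df.groupby("timer").cumcount()

# Pivot back to a wide layout: one row per pass, one column per timer.
wide = df.pivot(index="pass_idx", columns="timer", values="value")
print(wide)
```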