Dataset schema (one row per column: name, dtype, and the observed range or number of distinct values):

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | string | lengths 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | lengths 355–689 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 classes |
| Submission Date | string | 263 classes |
| Generation | int64 | 0–10 |
| Base Model | string | lengths 4–102 |

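The records below follow this schema mechanically: one cell value per column, in column order. A minimal sketch of reconstructing one record, using a subset of the columns and the values of fblgit/TheBeagle-v2beta-32B-MGS (bfloat16) as they appear in this dump; `parse_record` is an illustrative helper of my own, not part of any leaderboard tooling:

```python
# Rebuild a leaderboard record from a flat list of cell values.
# Column names come from the schema table above (subset for brevity).
COLUMNS = ["fullname", "Precision", "Average", "IFEval", "BBH",
           "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO"]

def parse_record(values):
    """Zip a flat list of cell values into a named record (dict)."""
    if len(values) != len(COLUMNS):
        raise ValueError(f"expected {len(COLUMNS)} cells, got {len(values)}")
    return dict(zip(COLUMNS, values))

row = parse_record(["fblgit/TheBeagle-v2beta-32B-MGS", "bfloat16",
                    42.642045, 51.807427, 58.027976, 49.471299,
                    17.673378, 24.260417, 54.611776])

# The "Average ⬆️" column is the arithmetic mean of the six benchmark
# scores (checked against the values in this dump, up to rounding).
scores = [row[c] for c in COLUMNS[3:]]
assert abs(sum(scores) / 6 - row["Average"]) < 1e-5
```

The same mean-of-six relation holds for the other records in this dump, which is a useful consistency check when re-ingesting the flattened rows.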
ehristoforu/moremerge-upscaled (eval_name: ehristoforu_moremerge-upscaled_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 2b50cf76b49db95caee5943e8ecc32237bc59a32
- Average ⬆️: 3.918261 · Hub License: — · Hub ❤️: 0 · #Params (B): 8.545 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.881269
- IFEval: 19.788827 (raw 0.197888) · BBH: 1.014763 (raw 0.269774) · MATH Lvl 5: 0 (raw 0) · GPQA: 0 (raw 0.246644) · MUSR: 2.246094 (raw 0.359302) · MMLU-PRO: 0.459885 (raw 0.104139)
- Merged: false · Official Providers: false · Uploaded: 2025-01-27 · Submitted: 2025-01-27 · Generation: 1 · Base Model: ehristoforu/moremerge-upscaled (Merge) · Details: open-llm-leaderboard/ehristoforu__moremerge-upscaled-details

ehristoforu/phi-4-25b (eval_name: ehristoforu_phi-4-25b_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Phi3ForCausalLM · sha: d9a7690e89b7971d6462f5fff7591a5381b3c192
- Average ⬆️: 39.116158 · Hub License: — · Hub ❤️: 7 · #Params (B): 24.883 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 4.663034
- IFEval: 64.836633 (raw 0.648366) · BBH: 55.672615 (raw 0.690778) · MATH Lvl 5: 45.241692 (raw 0.452417) · GPQA: 9.17226 (raw 0.318792) · MUSR: 11.432292 (raw 0.420792) · MMLU-PRO: 48.34146 (raw 0.535073)
- Merged: false · Official Providers: false · Uploaded: 2025-01-12 · Submitted: 2025-01-12 · Generation: 1 · Base Model: ehristoforu/phi-4-25b (Merge) · Details: open-llm-leaderboard/ehristoforu__phi-4-25b-details

ehristoforu/qwen2.5-test-32b-it (eval_name: ehristoforu_qwen2.5-test-32b-it_bfloat16)
- Precision: bfloat16 · Type: 💬 chat models (RLHF, DPO, IFT, ...) · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 6bcc8f1cedfe72471276d0159d1646be6ac50e40
- Average ⬆️: 47.368357 · Hub License: — · Hub ❤️: 9 · #Params (B): 32.764 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 29.544034
- IFEval: 78.894999 (raw 0.78895) · BBH: 58.283307 (raw 0.708059) · MATH Lvl 5: 59.743202 (raw 0.597432) · GPQA: 15.212528 (raw 0.364094) · MUSR: 19.126563 (raw 0.457813) · MMLU-PRO: 52.949542 (raw 0.576546)
- Merged: false · Official Providers: false · Uploaded: 2024-12-05 · Submitted: 2024-12-07 · Generation: 1 · Base Model: ehristoforu/qwen2.5-test-32b-it (Merge) · Details: open-llm-leaderboard/ehristoforu__qwen2.5-test-32b-it-details

ehristoforu/qwen2.5-with-lora-think-3b-it (eval_name: ehristoforu_qwen2.5-with-lora-think-3b-it_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 255e1d9c2eff51302276f99309c344179ee9d390
- Average ⬆️: 24.256524 · Hub License: other · Hub ❤️: 0 · #Params (B): 3.086 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.534691
- IFEval: 53.193748 (raw 0.531937) · BBH: 25.079463 (raw 0.468685) · MATH Lvl 5: 23.640483 (raw 0.236405) · GPQA: 4.026846 (raw 0.280201) · MUSR: 12.903125 (raw 0.430958) · MMLU-PRO: 26.695479 (raw 0.340259)
- Merged: false · Official Providers: false · Uploaded: 2025-01-10 · Submitted: 2025-01-10 · Generation: 1 · Base Model: Qwen/Qwen2.5-3B · Details: open-llm-leaderboard/ehristoforu__qwen2.5-with-lora-think-3b-it-details

ehristoforu/rmoe-v1 (eval_name: ehristoforu_rmoe-v1_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2MoeForCausalLM · sha: e99909f057d4adc6476b906b2f0385e75f8271f8
- Average ⬆️: 5.841232 · Hub License: mit · Hub ❤️: 0 · #Params (B): 11.026 · On hub: true · MoE: true · Flagged: false · Chat Template: true · CO₂ cost (kg): 6.069018
- IFEval: 26.500796 (raw 0.265008) · BBH: 2.067321 (raw 0.292929) · MATH Lvl 5: 0.151057 (raw 0.001511) · GPQA: 1.118568 (raw 0.258389) · MUSR: 3.826302 (raw 0.366344) · MMLU-PRO: 1.383348 (raw 0.11245)
- Merged: false · Official Providers: false · Uploaded: 2025-01-31 · Submitted: 2025-01-31 · Generation: 0 · Base Model: ehristoforu/rmoe-v1 · Details: open-llm-leaderboard/ehristoforu__rmoe-v1-details

ehristoforu/rufalcon3-3b-it (eval_name: ehristoforu_rufalcon3-3b-it_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: 8cf51e5308b4d18c1d67a882b87b81cbd1a46e84
- Average ⬆️: 20.641419 · Hub License: apache-2.0 · Hub ❤️: 0 · #Params (B): 3.228 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 0.981363
- IFEval: 59.421114 (raw 0.594211) · BBH: 18.214358 (raw 0.415542) · MATH Lvl 5: 17.824773 (raw 0.178248) · GPQA: 3.020134 (raw 0.272651) · MUSR: 10.391406 (raw 0.389531) · MMLU-PRO: 14.976729 (raw 0.234791)
- Merged: false · Official Providers: false · Uploaded: 2025-01-02 · Submitted: 2025-01-02 · Generation: 1 · Base Model: tiiuae/Falcon3-3B-Instruct · Details: open-llm-leaderboard/ehristoforu__rufalcon3-3b-it-details

ehristoforu/ruphi-4b (eval_name: ehristoforu_ruphi-4b_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Phi3ForCausalLM · sha: 228fc0c406609629e54068dbb7266f5a15ee89cc
- Average ⬆️: 4.080739 · Hub License: apache-2.0 · Hub ❤️: 0 · #Params (B): 3.821 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.100101
- IFEval: 17.518185 (raw 0.175182) · BBH: 2.400631 (raw 0.290603) · MATH Lvl 5: 0 (raw 0) · GPQA: 0 (raw 0.239933) · MUSR: 3.163802 (raw 0.351177) · MMLU-PRO: 1.401817 (raw 0.112616)
- Merged: false · Official Providers: false · Uploaded: 2025-01-02 · Submitted: 2025-01-02 · Generation: 2 · Base Model: microsoft/Phi-3.5-mini-instruct · Details: open-llm-leaderboard/ehristoforu__ruphi-4b-details

ehristoforu/testq-32b (eval_name: ehristoforu_testq-32b_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 0affeb22ce5bcdd33e4e931f7bb2511349c69c4b
- Average ⬆️: 4.538544 · Hub License: — · Hub ❤️: 0 · #Params (B): 56.165 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 94.464094
- IFEval: 18.759669 (raw 0.187597) · BBH: 1.932825 (raw 0.287655) · MATH Lvl 5: 0.302115 (raw 0.003021) · GPQA: 0.559284 (raw 0.254195) · MUSR: 3.832292 (raw 0.371458) · MMLU-PRO: 1.84508 (raw 0.116606)
- Merged: false · Official Providers: false · Uploaded: 2025-01-19 · Submitted: 2025-01-19 · Generation: 1 · Base Model: ehristoforu/testq-32b (Merge) · Details: open-llm-leaderboard/ehristoforu__testq-32b-details

ehristoforu/tmoe (eval_name: ehristoforu_tmoe_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2MoeForCausalLM · sha: 884bcb67207ebba2aaa909617185a7ef6459eae0
- Average ⬆️: 3.652325 · Hub License: apache-2.0 · Hub ❤️: 0 · #Params (B): 11.026 · On hub: true · MoE: true · Flagged: false · Chat Template: true · CO₂ cost (kg): 7.344955
- IFEval: 11.930234 (raw 0.119302) · BBH: 3.201361 (raw 0.307286) · MATH Lvl 5: 0.755287 (raw 0.007553) · GPQA: 0 (raw 0.223154) · MUSR: 3.904948 (raw 0.369906) · MMLU-PRO: 2.122119 (raw 0.119099)
- Merged: false · Official Providers: false · Uploaded: 2025-01-23 · Submitted: 2025-01-23 · Generation: 0 · Base Model: ehristoforu/tmoe · Details: open-llm-leaderboard/ehristoforu__tmoe-details

ehristoforu/tmoe-v2 (eval_name: ehristoforu_tmoe-v2_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2MoeForCausalLM · sha: 8d3d79d90a5f44995b0186da53c6031b6951010e
- Average ⬆️: 5.712206 · Hub License: apache-2.0 · Hub ❤️: 0 · #Params (B): 11.026 · On hub: true · MoE: true · Flagged: false · Chat Template: true · CO₂ cost (kg): 7.721372
- IFEval: 19.02696 (raw 0.19027) · BBH: 2.062357 (raw 0.289674) · MATH Lvl 5: 0.226586 (raw 0.002266) · GPQA: 1.789709 (raw 0.263423) · MUSR: 10.052083 (raw 0.415083) · MMLU-PRO: 1.115544 (raw 0.11004)
- Merged: false · Official Providers: false · Uploaded: 2025-01-24 · Submitted: 2025-01-24 · Generation: 0 · Base Model: ehristoforu/tmoe-v2 · Details: open-llm-leaderboard/ehristoforu__tmoe-v2-details

ehristoforu/trd-7b-it (eval_name: ehristoforu_trd-7b-it_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 9cb438f907825bc2b7bb6d9ed6a8fa2693abc7ec
- Average ⬆️: 6.323081 · Hub License: — · Hub ❤️: 0 · #Params (B): 7.613 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.421302
- IFEval: 21.847143 (raw 0.218471) · BBH: 2.722591 (raw 0.299024) · MATH Lvl 5: 3.172205 (raw 0.031722) · GPQA: 2.684564 (raw 0.270134) · MUSR: 5.528385 (raw 0.379427) · MMLU-PRO: 1.983599 (raw 0.117852)
- Merged: false · Official Providers: false · Uploaded: 2025-01-31 · Submitted: 2025-01-31 · Generation: 1 · Base Model: ehristoforu/trd-7b-it (Merge) · Details: open-llm-leaderboard/ehristoforu__trd-7b-it-details

ehristoforu/ud-14b (eval_name: ehristoforu_ud-14b_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: c80f651c121e601e50312b3376ba77a032d0bd34
- Average ⬆️: 16.222392 · Hub License: — · Hub ❤️: 0 · #Params (B): 14.766 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 3.938749
- IFEval: 42.352735 (raw 0.423527) · BBH: 6.295264 (raw 0.332382) · MATH Lvl 5: 19.033233 (raw 0.190332) · GPQA: 0 (raw 0.237416) · MUSR: 13.928385 (raw 0.439427) · MMLU-PRO: 15.724734 (raw 0.241523)
- Merged: false · Official Providers: false · Uploaded: 2025-01-28 · Submitted: 2025-01-28 · Generation: 1 · Base Model: ehristoforu/ud-14b (Merge) · Details: open-llm-leaderboard/ehristoforu__ud-14b-details

elinas/Chronos-Gold-12B-1.0 (eval_name: elinas_Chronos-Gold-12B-1.0_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: MistralForCausalLM · sha: cf76a4621b9dfc0c2e6d930756e6c7c9ce2b260b
- Average ⬆️: 21.828168 · Hub License: apache-2.0 · Hub ❤️: 46 · #Params (B): 12.248 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 3.005062
- IFEval: 31.65656 (raw 0.316566) · BBH: 35.908947 (raw 0.551466) · MATH Lvl 5: 6.94864 (raw 0.069486) · GPQA: 9.060403 (raw 0.317953) · MUSR: 19.415365 (raw 0.47399) · MMLU-PRO: 27.979093 (raw 0.351812)
- Merged: true · Official Providers: false · Uploaded: 2024-08-21 · Submitted: 2024-09-15 · Generation: 1 · Base Model: mistralai/Mistral-Nemo-Base-2407 · Details: open-llm-leaderboard/elinas__Chronos-Gold-12B-1.0-details

ell44ot/gemma-2b-def (eval_name: ell44ot_gemma-2b-def_float16)
- Precision: float16 · Type: 🟢 pretrained · Weight type: Original · Architecture: GemmaModel · sha: f9f1f882322360354fbc7a71d44d9b0b9ddd87ee
- Average ⬆️: 8.122919 · Hub License: apache-2.0 · Hub ❤️: 0 · #Params (B): 1.546 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 0.894595
- IFEval: 26.930433 (raw 0.269304) · BBH: 4.58642 (raw 0.315865) · MATH Lvl 5: 2.416918 (raw 0.024169) · GPQA: 3.131991 (raw 0.27349) · MUSR: 5.310938 (raw 0.367021) · MMLU-PRO: 6.360816 (raw 0.157247)
- Merged: false · Official Providers: false · Uploaded: 2024-11-28 · Submitted: 2024-11-28 · Generation: 1 · Base Model: ell44ot/gemma-2b-def (Merge) · Details: open-llm-leaderboard/ell44ot__gemma-2b-def-details

euclaise/ReMask-3B (eval_name: euclaise_ReMask-3B_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: StableLmForCausalLM · sha: e094dae96097c2bc6f758101ee269c089b65a2cf
- Average ⬆️: 7.294405 · Hub License: cc-by-sa-4.0 · Hub ❤️: 15 · #Params (B): 2.795 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 0.893681
- IFEval: 24.192698 (raw 0.241927) · BBH: 8.742083 (raw 0.351678) · MATH Lvl 5: 1.963746 (raw 0.019637) · GPQA: 2.237136 (raw 0.266779) · MUSR: 2.661719 (raw 0.334094) · MMLU-PRO: 3.969046 (raw 0.135721)
- Merged: false · Official Providers: false · Uploaded: 2024-03-28 · Submitted: 2024-08-10 · Generation: 0 · Base Model: euclaise/ReMask-3B · Details: open-llm-leaderboard/euclaise__ReMask-3B-details

eworojoshua/vas-01 (eval_name: eworojoshua_vas-01_float16)
- Precision: float16 · Type: 🤝 base merges and moerges · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 0eb818ab19c02344d853dccfe37dd459abf2ec04
- Average ⬆️: 36.466422 · Hub License: — · Hub ❤️: 0 · #Params (B): 7.616 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.346624
- IFEval: 76.124793 (raw 0.761248) · BBH: 34.808538 (raw 0.541782) · MATH Lvl 5: 47.356495 (raw 0.473565) · GPQA: 7.941834 (raw 0.309564) · MUSR: 15.371615 (raw 0.44324) · MMLU-PRO: 37.195257 (raw 0.434757)
- Merged: false · Official Providers: false · Uploaded: 2025-02-24 · Submitted: 2025-02-26 · Generation: 1 · Base Model: eworojoshua/vas-01 (Merge) · Details: open-llm-leaderboard/eworojoshua__vas-01-details

ewre324/Thinker-Llama-3.2-3B-Instruct-Reasoning (eval_name: ewre324_Thinker-Llama-3.2-3B-Instruct-Reasoning_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: 7a425678e7770b059db7106f4c234895b975b705
- Average ⬆️: 17.331992 · Hub License: llama3.2 · Hub ❤️: 0 · #Params (B): 3.213 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 1.190366
- IFEval: 44.388556 (raw 0.443886) · BBH: 19.412583 (raw 0.427313) · MATH Lvl 5: 8.459215 (raw 0.084592) · GPQA: 3.579418 (raw 0.276846) · MUSR: 7.191406 (raw 0.365531) · MMLU-PRO: 20.960771 (raw 0.288647)
- Merged: false · Official Providers: false · Uploaded: 2025-01-07 · Submitted: 2025-01-07 · Generation: 0 · Base Model: ewre324/Thinker-Llama-3.2-3B-Instruct-Reasoning · Details: open-llm-leaderboard/ewre324__Thinker-Llama-3.2-3B-Instruct-Reasoning-details

ewre324/Thinker-Qwen2.5-0.5B-Instruct-Reasoning (eval_name: ewre324_Thinker-Qwen2.5-0.5B-Instruct-Reasoning_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: e0d596dd855b37a444d275c9638ce7353b7ee5b6
- Average ⬆️: 8.066116 · Hub License: apache-2.0 · Hub ❤️: 0 · #Params (B): 0.494 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 1.02424
- IFEval: 24.764735 (raw 0.247647) · BBH: 7.39461 (raw 0.329212) · MATH Lvl 5: 2.870091 (raw 0.028701) · GPQA: 4.697987 (raw 0.285235) · MUSR: 1.477344 (raw 0.338219) · MMLU-PRO: 7.191933 (raw 0.164727)
- Merged: false · Official Providers: false · Uploaded: 2025-01-07 · Submitted: 2025-01-07 · Generation: 0 · Base Model: ewre324/Thinker-Qwen2.5-0.5B-Instruct-Reasoning · Details: open-llm-leaderboard/ewre324__Thinker-Qwen2.5-0.5B-Instruct-Reasoning-details

ewre324/Thinker-SmolLM2-135M-Instruct-Reasoning (eval_name: ewre324_Thinker-SmolLM2-135M-Instruct-Reasoning_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: 7dbd1a18e98892dbff1c6a51550ded17398e8518
- Average ⬆️: 5.843149 · Hub License: apache-2.0 · Hub ❤️: 1 · #Params (B): 0.135 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 0.668103
- IFEval: 25.836336 (raw 0.258363) · BBH: 3.973353 (raw 0.307135) · MATH Lvl 5: 0.906344 (raw 0.009063) · GPQA: 0.33557 (raw 0.252517) · MUSR: 2.965625 (raw 0.366125) · MMLU-PRO: 1.041667 (raw 0.109375)
- Merged: false · Official Providers: false · Uploaded: 2025-01-07 · Submitted: 2025-01-07 · Generation: 1 · Base Model: ewre324/Thinker-SmolLM2-135M-Instruct-Reasoning (Merge) · Details: open-llm-leaderboard/ewre324__Thinker-SmolLM2-135M-Instruct-Reasoning-details

ewre324/ewre324-R1-SmolLM2-135M-Distill (eval_name: ewre324_ewre324-R1-SmolLM2-135M-Distill_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: 3592b13d6df2b6090819afed0be93b374b649b8d
- Average ⬆️: 4.164699 · Hub License: — · Hub ❤️: 0 · #Params (B): 0.135 · On hub: false · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 0.705821
- IFEval: 16.489027 (raw 0.16489) · BBH: 3.383004 (raw 0.30417) · MATH Lvl 5: 1.283988 (raw 0.01284) · GPQA: 1.565996 (raw 0.261745) · MUSR: 0.78125 (raw 0.340917) · MMLU-PRO: 1.484929 (raw 0.113364)
- Merged: false · Official Providers: false · Uploaded: 2025-01-30 · Submitted: 2025-01-30 · Generation: 1 · Base Model: ewre324/ewre324-R1-SmolLM2-135M-Distill (Merge) · Details: open-llm-leaderboard/ewre324__ewre324-R1-SmolLM2-135M-Distill-details

experiment-llm/exp-3-q-r (eval_name: experiment-llm_exp-3-q-r_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: d4300d83f75f6d95fe44a18aa0099e37dcd7868a
- Average ⬆️: 29.50441 · Hub License: apache-2.0 · Hub ❤️: 0 · #Params (B): 7.616 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.467311
- IFEval: 60.357851 (raw 0.603579) · BBH: 33.994917 (raw 0.539716) · MATH Lvl 5: 27.870091 (raw 0.278701) · GPQA: 5.816555 (raw 0.293624) · MUSR: 12.142708 (raw 0.431542) · MMLU-PRO: 36.844341 (raw 0.431599)
- Merged: false · Official Providers: false · Uploaded: 2024-12-02 · Submitted: 2024-12-02 · Generation: 4 · Base Model: rombodawg/Rombos-LLM-V2.5-Qwen-7b (Merge) · Details: open-llm-leaderboard/experiment-llm__exp-3-q-r-details

facebook/opt-1.3b (eval_name: facebook_opt-1.3b_float16)
- Precision: float16 · Type: 🟢 pretrained · Weight type: Original · Architecture: OPTForCausalLM · sha: 3f5c25d0bc631cb57ac65913f76e22c2dfb61d62
- Average ⬆️: 5.276689 · Hub License: other · Hub ❤️: 168 · #Params (B): 1.3 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 0.80601
- IFEval: 23.832985 (raw 0.23833) · BBH: 3.648052 (raw 0.309395) · MATH Lvl 5: 0.906344 (raw 0.009063) · GPQA: 0 (raw 0.24245) · MUSR: 2.083333 (raw 0.342) · MMLU-PRO: 1.189421 (raw 0.110705)
- Merged: false · Official Providers: true · Uploaded: 2022-05-11 · Submitted: 2024-06-12 · Generation: 0 · Base Model: facebook/opt-1.3b · Details: open-llm-leaderboard/facebook__opt-1.3b-details

facebook/opt-30b (eval_name: facebook_opt-30b_float16)
- Precision: float16 · Type: 🟢 pretrained · Weight type: Original · Architecture: OPTForCausalLM · sha: ceea0a90ac0f6fae7c2c34bcb40477438c152546
- Average ⬆️: 6.276874 · Hub License: other · Hub ❤️: 133 · #Params (B): 30 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 5.99969
- IFEval: 24.529914 (raw 0.245299) · BBH: 3.498429 (raw 0.307034) · MATH Lvl 5: 1.057402 (raw 0.010574) · GPQA: 2.572707 (raw 0.269295) · MUSR: 4.185417 (raw 0.360417) · MMLU-PRO: 1.817376 (raw 0.116356)
- Merged: false · Official Providers: true · Uploaded: 2022-05-11 · Submitted: 2024-06-12 · Generation: 0 · Base Model: facebook/opt-30b · Details: open-llm-leaderboard/facebook__opt-30b-details

failspy/Llama-3-8B-Instruct-MopeyMule (eval_name: failspy_Llama-3-8B-Instruct-MopeyMule_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: d1cbf407efe727c6b9fc94f22d51ff4915e1856e
- Average ⬆️: 15.638133 · Hub License: other · Hub ❤️: 78 · #Params (B): 8.03 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.646272
- IFEval: 67.504444 (raw 0.675044) · BBH: 13.620496 (raw 0.383874) · MATH Lvl 5: 1.963746 (raw 0.019637) · GPQA: 0 (raw 0.239094) · MUSR: 2.246094 (raw 0.351302) · MMLU-PRO: 8.494016 (raw 0.176446)
- Merged: false · Official Providers: false · Uploaded: 2024-05-30 · Submitted: 2024-09-21 · Generation: 0 · Base Model: failspy/Llama-3-8B-Instruct-MopeyMule · Details: open-llm-leaderboard/failspy__Llama-3-8B-Instruct-MopeyMule-details

failspy/Llama-3-8B-Instruct-abliterated (eval_name: failspy_Llama-3-8B-Instruct-abliterated_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: dd67dd055661e4cbcedb0ed2431693d9cc3be6e0
- Average ⬆️: 19.190256 · Hub License: llama3 · Hub ❤️: 8 · #Params (B): 8.03 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 1.483812
- IFEval: 59.088884 (raw 0.590889) · BBH: 18.864599 (raw 0.435375) · MATH Lvl 5: 3.851964 (raw 0.03852) · GPQA: 3.467562 (raw 0.276007) · MUSR: 10.514583 (raw 0.411583) · MMLU-PRO: 19.353945 (raw 0.274186)
- Merged: false · Official Providers: false · Uploaded: 2024-05-07 · Submitted: 2024-07-03 · Generation: 0 · Base Model: failspy/Llama-3-8B-Instruct-abliterated · Details: open-llm-leaderboard/failspy__Llama-3-8B-Instruct-abliterated-details

failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5 (eval_name: failspy_Meta-Llama-3-70B-Instruct-abliterated-v3.5_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: fc951b03d92972ab52ad9392e620eba6173526b9
- Average ⬆️: 30.129354 · Hub License: llama3 · Hub ❤️: 43 · #Params (B): 70.554 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 18.409422
- IFEval: 77.468672 (raw 0.774687) · BBH: 37.871333 (raw 0.57471) · MATH Lvl 5: 12.839879 (raw 0.128399) · GPQA: 6.263982 (raw 0.29698) · MUSR: 7.973438 (raw 0.398187) · MMLU-PRO: 38.358821 (raw 0.445229)
- Merged: false · Official Providers: false · Uploaded: 2024-05-28 · Submitted: 2024-08-30 · Generation: 0 · Base Model: failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5 · Details: open-llm-leaderboard/failspy__Meta-Llama-3-70B-Instruct-abliterated-v3.5-details

failspy/Meta-Llama-3-8B-Instruct-abliterated-v3 (eval_name: failspy_Meta-Llama-3-8B-Instruct-abliterated-v3_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: 85a25be002841fe738a5267b6806473f36f86715
- Average ⬆️: 23.93309 · Hub License: llama3 · Hub ❤️: 47 · #Params (B): 8.03 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 0.652527
- IFEval: 72.445334 (raw 0.724453) · BBH: 27.335051 (raw 0.492456) · MATH Lvl 5: 9.592145 (raw 0.095921) · GPQA: 1.901566 (raw 0.264262) · MUSR: 2.840104 (raw 0.362187) · MMLU-PRO: 29.484338 (raw 0.365359)
- Merged: false · Official Providers: false · Uploaded: 2024-05-20 · Submitted: 2025-02-12 · Generation: 0 · Base Model: failspy/Meta-Llama-3-8B-Instruct-abliterated-v3 · Details: open-llm-leaderboard/failspy__Meta-Llama-3-8B-Instruct-abliterated-v3-details

failspy/Phi-3-medium-4k-instruct-abliterated-v3 (eval_name: failspy_Phi-3-medium-4k-instruct-abliterated-v3_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Phi3ForCausalLM · sha: 959b09eacf6cae85a8eb21b25e998addc89a367b
- Average ⬆️: 31.851121 · Hub License: mit · Hub ❤️: 23 · #Params (B): 13.96 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 3.041962
- IFEval: 63.192995 (raw 0.63193) · BBH: 46.732839 (raw 0.63048) · MATH Lvl 5: 15.936556 (raw 0.159366) · GPQA: 8.948546 (raw 0.317114) · MUSR: 18.51875 (raw 0.460417) · MMLU-PRO: 37.777039 (raw 0.439993)
- Merged: false · Official Providers: false · Uploaded: 2024-05-22 · Submitted: 2024-07-29 · Generation: 0 · Base Model: failspy/Phi-3-medium-4k-instruct-abliterated-v3 · Details: open-llm-leaderboard/failspy__Phi-3-medium-4k-instruct-abliterated-v3-details

failspy/llama-3-70B-Instruct-abliterated (eval_name: failspy_llama-3-70B-Instruct-abliterated_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: 53ae9dafe8b3d163e05d75387575f8e9f43253d0
- Average ⬆️: 35.890019 · Hub License: llama3 · Hub ❤️: 104 · #Params (B): 70.554 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 18.748258
- IFEval: 80.233891 (raw 0.802339) · BBH: 48.939818 (raw 0.646485) · MATH Lvl 5: 24.320242 (raw 0.243202) · GPQA: 5.257271 (raw 0.28943) · MUSR: 10.528385 (raw 0.41276) · MMLU-PRO: 46.060505 (raw 0.514545)
- Merged: false · Official Providers: false · Uploaded: 2024-05-07 · Submitted: 2024-07-03 · Generation: 0 · Base Model: failspy/llama-3-70B-Instruct-abliterated · Details: open-llm-leaderboard/failspy__llama-3-70B-Instruct-abliterated-details

fblgit/TheBeagle-v2beta-32B-MGS (eval_name: fblgit_TheBeagle-v2beta-32B-MGS_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 56830f63e4a40378b7721ae966637b4678cc8784
- Average ⬆️: 42.642045 · Hub License: other · Hub ❤️: 17 · #Params (B): 32.764 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 43.410161
- IFEval: 51.807427 (raw 0.518074) · BBH: 58.027976 (raw 0.703263) · MATH Lvl 5: 49.471299 (raw 0.494713) · GPQA: 17.673378 (raw 0.38255) · MUSR: 24.260417 (raw 0.50075) · MMLU-PRO: 54.611776 (raw 0.591506)
- Merged: false · Official Providers: false · Uploaded: 2024-10-20 · Submitted: 2024-10-30 · Generation: 1 · Base Model: fblgit/TheBeagle-v2beta-32B-MGS (Merge) · Details: open-llm-leaderboard/fblgit__TheBeagle-v2beta-32B-MGS-details

fblgit/TheBeagle-v2beta-32B-MGS (eval_name: fblgit_TheBeagle-v2beta-32B-MGS_float16)
- Precision: float16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: 56830f63e4a40378b7721ae966637b4678cc8784
- Average ⬆️: 40.28667 · Hub License: other · Hub ❤️: 17 · #Params (B): 32.764 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 11.366068
- IFEval: 45.030519 (raw 0.450305) · BBH: 58.06603 (raw 0.703542) · MATH Lvl 5: 39.425982 (raw 0.39426) · GPQA: 20.134228 (raw 0.401007) · MUSR: 24.497656 (raw 0.502115) · MMLU-PRO: 54.565603 (raw 0.59109)
- Merged: false · Official Providers: false · Uploaded: 2024-10-20 · Submitted: 2024-10-20 · Generation: 1 · Base Model: fblgit/TheBeagle-v2beta-32B-MGS (Merge) · Details: open-llm-leaderboard/fblgit__TheBeagle-v2beta-32B-MGS-details

fblgit/UNA-SimpleSmaug-34b-v1beta (eval_name: fblgit_UNA-SimpleSmaug-34b-v1beta_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: 4b62fccfc7e44c0a02c11a5279d98fafa6b922ba
- Average ⬆️: 24.292092 · Hub License: apache-2.0 · Hub ❤️: 21 · #Params (B): 34.389 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 6.328932
- IFEval: 45.562552 (raw 0.455626) · BBH: 32.775789 (raw 0.528665) · MATH Lvl 5: 7.175227 (raw 0.071752) · GPQA: 8.948546 (raw 0.317114) · MUSR: 11.961979 (raw 0.425563) · MMLU-PRO: 39.328457 (raw 0.453956)
- Merged: false · Official Providers: false · Uploaded: 2024-02-05 · Submitted: 2024-06-30 · Generation: 2 · Base Model: jondurbin/bagel-34b-v0.2 · Details: open-llm-leaderboard/fblgit__UNA-SimpleSmaug-34b-v1beta-details

fblgit/UNA-TheBeagle-7b-v1 (eval_name: fblgit_UNA-TheBeagle-7b-v1_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: MistralForCausalLM · sha: 866d3ee19f983728e21a624f8a27574960073f27
- Average ⬆️: 19.646171 · Hub License: cc-by-nc-nd-4.0 · Hub ❤️: 37 · #Params (B): 7.242 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 1.121278
- IFEval: 36.887237 (raw 0.368872) · BBH: 30.173397 (raw 0.502869) · MATH Lvl 5: 7.703927 (raw 0.077039) · GPQA: 4.58613 (raw 0.284396) · MUSR: 16.088021 (raw 0.456438) · MMLU-PRO: 22.438313 (raw 0.301945)
- Merged: false · Official Providers: false · Uploaded: 2024-01-09 · Submitted: 2024-06-30 · Generation: 0 · Base Model: fblgit/UNA-TheBeagle-7b-v1 · Details: open-llm-leaderboard/fblgit__UNA-TheBeagle-7b-v1-details

fblgit/UNA-ThePitbull-21.4B-v2 (eval_name: fblgit_UNA-ThePitbull-21.4B-v2_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: LlamaForCausalLM · sha: f12aac93ae9c852550a16816e16116c4f8e7dec0
- Average ⬆️: 23.026569 · Hub License: afl-3.0 · Hub ❤️: 16 · #Params (B): 21.421 · On hub: true · MoE: false · Flagged: false · Chat Template: true · CO₂ cost (kg): 4.596828
- IFEval: 37.903873 (raw 0.379039) · BBH: 46.788074 (raw 0.635039) · MATH Lvl 5: 12.160121 (raw 0.121601) · GPQA: 6.935123 (raw 0.302013) · MUSR: 6.420833 (raw 0.392167) · MMLU-PRO: 27.951389 (raw 0.351563)
- Merged: false · Official Providers: false · Uploaded: 2024-05-28 · Submitted: 2024-06-30 · Generation: 0 · Base Model: fblgit/UNA-ThePitbull-21.4B-v2 · Details: open-llm-leaderboard/fblgit__UNA-ThePitbull-21.4B-v2-details

fblgit/cybertron-v4-qw7B-MGS (eval_name: fblgit_cybertron-v4-qw7B-MGS_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: ea2aaf4f4000190235722a9ad4f5cd9e9091a64e
- Average ⬆️: 32.403519 · Hub License: other · Hub ❤️: 15 · #Params (B): 7.616 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 2.493477
- IFEval: 62.638466 (raw 0.626385) · BBH: 37.041623 (raw 0.559177) · MATH Lvl 5: 34.89426 (raw 0.348943) · GPQA: 8.053691 (raw 0.310403) · MUSR: 13.203385 (raw 0.437094) · MMLU-PRO: 38.589687 (raw 0.447307)
- Merged: false · Official Providers: false · Uploaded: 2024-10-29 · Submitted: 2024-10-29 · Generation: 1 · Base Model: fblgit/cybertron-v4-qw7B-MGS (Merge) · Details: open-llm-leaderboard/fblgit__cybertron-v4-qw7B-MGS-details

fblgit/cybertron-v4-qw7B-UNAMGS (eval_name: fblgit_cybertron-v4-qw7B-UNAMGS_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: Qwen2ForCausalLM · sha: ce9b1e991908f5b89f63a2e3212cf9a066906ed2
- Average ⬆️: 33.059494 · Hub License: other · Hub ❤️: 9 · #Params (B): 7.616 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 1.990658
- IFEval: 60.902406 (raw 0.609024) · BBH: 37.707173 (raw 0.564251) · MATH Lvl 5: 37.311178 (raw 0.373112) · GPQA: 10.850112 (raw 0.331376) · MUSR: 12.691667 (raw 0.434333) · MMLU-PRO: 38.89443 (raw 0.45005)
- Merged: false · Official Providers: false · Uploaded: 2024-11-18 · Submitted: 2024-11-18 · Generation: 1 · Base Model: fblgit/cybertron-v4-qw7B-UNAMGS (Merge) · Details: open-llm-leaderboard/fblgit__cybertron-v4-qw7B-UNAMGS-details

fblgit/juanako-7b-UNA (eval_name: fblgit_juanako-7b-UNA_bfloat16)
- Precision: bfloat16 · Type: 🔶 fine-tuned on domain-specific datasets · Weight type: Original · Architecture: MistralForCausalLM · sha: b8ac85b603d5ee1ac619b2e1d0b3bb86c4eecb0c
- Average ⬆️: 20.863068 · Hub License: apache-2.0 · Hub ❤️: 23 · #Params (B): 7.242 · On hub: true · MoE: false · Flagged: false · Chat Template: false · CO₂ cost (kg): 1.263581
- IFEval: 48.372762 (raw 0.483728) · BBH: 30.415072 (raw 0.507001) · MATH Lvl 5: 3.398792 (raw 0.033988) · GPQA: 6.152125 (raw 0.296141) · MUSR: 17.1625 (raw 0.4645) · MMLU-PRO: 19.677157 (raw 0.277094)
- Merged: false · Official Providers: false · Uploaded: 2023-11-27 · Submitted: 2024-06-30 · Generation: 0 · Base Model: fblgit/juanako-7b-UNA · Details: open-llm-leaderboard/fblgit__juanako-7b-UNA-details

fblgit_miniclaus-qw1.5B-UNAMGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/miniclaus-qw1.5B-UNAMGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/miniclaus-qw1.5B-UNAMGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__miniclaus-qw1.5B-UNAMGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/miniclaus-qw1.5B-UNAMGS
de590536ba82ffb7b4001dffb5f8b60d2087c319
17.045102
other
8
1.777
true
false
false
false
1.168256
0.334801
33.480055
0.423859
18.562864
0.108761
10.876133
0.291946
5.592841
0.429344
12.234635
0.293717
21.524084
false
false
2024-11-01
2024-11-01
2
Qwen/Qwen2.5-1.5B
fblgit_miniclaus-qw1.5B-UNAMGS-GRPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/miniclaus-qw1.5B-UNAMGS-GRPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/miniclaus-qw1.5B-UNAMGS-GRPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__miniclaus-qw1.5B-UNAMGS-GRPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/miniclaus-qw1.5B-UNAMGS-GRPO
16df845e8a3e6160ea185891b0c19f9c951eaea7
17.440457
other
5
1.544
true
false
false
false
1.744523
0.351836
35.183646
0.423443
18.626616
0.110272
11.02719
0.297819
6.375839
0.425437
11.813021
0.294548
21.61643
false
false
2025-02-03
2025-02-03
3
Qwen/Qwen2.5-1.5B
fblgit_pancho-v1-qw25-3B-UNAMGS_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/pancho-v1-qw25-3B-UNAMGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/pancho-v1-qw25-3B-UNAMGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__pancho-v1-qw25-3B-UNAMGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/pancho-v1-qw25-3B-UNAMGS
01143501cbc2c90961be5397c6945c6789815a60
23.860635
other
3
3.397
true
false
false
false
2.353076
0.536134
53.613412
0.492583
28.66965
0.1571
15.70997
0.29698
6.263982
0.40274
8.175781
0.376579
30.731014
false
false
2024-11-04
2024-11-12
2
Qwen/Qwen2.5-3B
fblgit_una-cybertron-7b-v2-bf16_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/una-cybertron-7b-v2-bf16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__una-cybertron-7b-v2-bf16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/una-cybertron-7b-v2-bf16
7ab101a153740aec39e95ec02831c56f4eab7910
17.217325
apache-2.0
116
7.242
true
false
false
true
1.268411
0.473711
47.371086
0.397339
14.966965
0.040785
4.07855
0.297819
6.375839
0.447323
14.482031
0.244265
16.029477
false
false
2023-12-02
2024-06-30
0
fblgit/una-cybertron-7b-v2-bf16
fhai50032_RolePlayLake-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fhai50032/RolePlayLake-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fhai50032/RolePlayLake-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fhai50032__RolePlayLake-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fhai50032/RolePlayLake-7B
4b1a6477bbdf6ce4a384d7e7ec1d99641f142bde
22.754718
apache-2.0
13
7.242
true
false
false
false
0.865253
0.505659
50.565943
0.525217
33.479586
0.072508
7.250755
0.303691
7.158837
0.445927
14.074219
0.315991
23.998966
true
false
2024-01-29
2025-01-17
1
fhai50032/RolePlayLake-7B (Merge)
fhai50032_Unaligned-Thinker-PHI-4_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fhai50032/Unaligned-Thinker-PHI-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fhai50032/Unaligned-Thinker-PHI-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fhai50032__Unaligned-Thinker-PHI-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fhai50032/Unaligned-Thinker-PHI-4
8f394ab92e3605d0e352585450ed97d2c4074d45
28.899268
apache-2.0
1
14.66
true
false
false
false
1.804352
0.056254
5.625407
0.664258
51.925049
0.335347
33.534743
0.380872
17.449664
0.467854
18.781771
0.514711
46.078975
false
false
2025-01-16
2025-01-17
2
microsoft/phi-4
flammenai_Llama3.1-Flammades-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Llama3.1-Flammades-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Llama3.1-Flammades-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Llama3.1-Flammades-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Llama3.1-Flammades-70B
48909a734460e667e3a7e91bd25f124ec3b2ba74
36.994121
llama3.1
2
70.554
true
false
false
true
20.569666
0.705844
70.584383
0.665972
52.547943
0.209215
20.92145
0.354027
13.870246
0.487052
22.348177
0.475233
41.692524
false
false
2024-10-12
2024-10-13
1
flammenai/Llama3.1-Flammades-70B (Merge)
flammenai_Mahou-1.2a-llama3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.2a-llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.2a-llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.2a-llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.2a-llama3-8B
3318b6f5f1839644bee287a3e5390f3e9f565a9e
21.791262
llama3
6
8.03
true
false
false
false
1.864824
0.509257
50.925655
0.509366
28.972588
0.083837
8.383686
0.288591
5.145414
0.384667
6.016667
0.381732
31.303561
false
false
2024-05-25
2024-09-03
1
flammenai/Mahou-1.2a-llama3-8B (Merge)
flammenai_Mahou-1.2a-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.2a-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.2a-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.2a-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.2a-mistral-7B
d45f61cca04da0c3359573102853fca1a0d3b252
19.578991
apache-2.0
6
7.242
true
false
false
false
2.414893
0.455201
45.520109
0.511811
31.16675
0.068731
6.873112
0.271812
2.908277
0.389625
6.969792
0.316323
24.035904
false
false
2024-05-18
2024-09-03
1
flammenai/Mahou-1.2a-mistral-7B (Merge)
flammenai_Mahou-1.5-llama3.1-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.5-llama3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.5-llama3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.5-llama3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.5-llama3.1-70B
49f45cc4c21e2ba7ed5c5e71f90ffd0bd9169e2d
37.344913
llama3.1
7
70.554
true
false
false
true
20.519985
0.714662
71.466154
0.665086
52.369577
0.20997
20.996979
0.354027
13.870246
0.495021
23.710938
0.4749
41.655585
false
false
2024-10-14
2024-10-14
1
flammenai/Mahou-1.5-llama3.1-70B (Merge)
flammenai_Mahou-1.5-mistral-nemo-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.5-mistral-nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.5-mistral-nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.5-mistral-nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.5-mistral-nemo-12B
852561e74f1785bf7225bb28395db1fd9431fe31
26.885326
apache-2.0
19
12.248
true
false
false
true
2.965264
0.675144
67.514417
0.552236
36.26051
0.086858
8.685801
0.276007
3.467562
0.452042
16.471875
0.360206
28.911791
false
false
2024-10-06
2024-10-07
1
flammenai/Mahou-1.5-mistral-nemo-12B (Merge)
flammenai_flammen15-gutenberg-DPO-v1-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/flammen15-gutenberg-DPO-v1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/flammen15-gutenberg-DPO-v1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__flammen15-gutenberg-DPO-v1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/flammen15-gutenberg-DPO-v1-7B
550cd9548cba1265cb1771c85ebe498789fdecb5
21.612698
apache-2.0
3
7.242
true
false
false
false
1.255059
0.479806
47.98058
0.520298
32.665113
0.076284
7.628399
0.284396
4.58613
0.429313
12.530729
0.318567
24.285239
false
false
2024-04-05
2024-07-10
1
flammenai/flammen15-gutenberg-DPO-v1-7B (Merge)
fluently-lm_FluentlyLM-Prinum_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fluently-lm/FluentlyLM-Prinum" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fluently-lm/FluentlyLM-Prinum</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fluently-lm__FluentlyLM-Prinum-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fluently-lm/FluentlyLM-Prinum
3f5e29069004437ce567183dfe2eb1fce262fede
47.216938
mit
23
32.764
true
false
false
true
21.252761
0.809033
80.903336
0.714381
59.482203
0.54003
54.003021
0.386745
18.232662
0.447146
17.259896
0.580785
53.420508
false
false
2025-02-16
2025-02-16
0
fluently-lm/FluentlyLM-Prinum
fluently-lm_Llama-TI-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fluently-lm/Llama-TI-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fluently-lm/Llama-TI-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fluently-lm__Llama-TI-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fluently-lm/Llama-TI-8B
2ab7eb6daca1c850cc65cec04f4d374b1041d824
21.06212
apache-2.0
2
8.03
true
false
false
false
1.360785
0.288039
28.803907
0.520086
31.984333
0.196375
19.637462
0.296141
6.152125
0.410271
12.683854
0.343999
27.111037
false
false
2024-12-07
2024-12-07
1
meta-llama/Llama-3.1-8B
fluently-lm_Llama-TI-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fluently-lm/Llama-TI-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fluently-lm/Llama-TI-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fluently-lm__Llama-TI-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fluently-lm/Llama-TI-8B-Instruct
e93cf77978d3a1589de67df226b62383c460c15c
29.671617
apache-2.0
2
8.03
true
false
false
true
1.394032
0.771639
77.163925
0.525214
32.266867
0.230363
23.036254
0.295302
6.040268
0.381344
9.234635
0.37259
30.287751
false
false
2024-12-07
2025-01-16
2
meta-llama/Meta-Llama-3.1-8B
fluently-sets_FalconThink3-10B-IT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fluently-sets/FalconThink3-10B-IT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fluently-sets/FalconThink3-10B-IT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fluently-sets__FalconThink3-10B-IT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fluently-sets/FalconThink3-10B-IT
1a352e2efc0369ebd05a143d24e37124a2239b36
34.620674
apache-2.0
3
10.306
true
false
false
true
1.746814
0.732622
73.262167
0.620017
44.975804
0.244713
24.471299
0.334732
11.297539
0.447885
15.552344
0.443484
38.164894
false
false
2024-12-29
2024-12-29
2
tiiuae/Falcon3-10B-Base
fluently-sets_reasoning-1-1k-demo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fluently-sets/reasoning-1-1k-demo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fluently-sets/reasoning-1-1k-demo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fluently-sets__reasoning-1-1k-demo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fluently-sets/reasoning-1-1k-demo
483acbedc44430c2fcc2552a8f1121457c43035b
38.341664
apache-2.0
1
14.77
true
false
false
true
3.287029
0.75248
75.248009
0.639669
48.944034
0.428248
42.824773
0.33557
11.409396
0.406063
9.691146
0.477394
41.932624
false
false
2024-12-21
2024-12-21
2
Qwen/Qwen2.5-14B
formulae_mita-elite-sce-gen1.1-v1-7b-2-26-2025-exp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-elite-sce-gen1.1-v1-7b-2-26-2025-exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-elite-sce-gen1.1-v1-7b-2-26-2025-exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-elite-sce-gen1.1-v1-7b-2-26-2025-exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-elite-sce-gen1.1-v1-7b-2-26-2025-exp
074a3c6455afd78e7ed02e5c8aa176cb02ce71c9
5.330152
0
7.616
false
false
false
true
0.699944
0.161393
16.139288
0.297639
1.972762
0.001511
0.151057
0.253356
0.447427
0.421938
11.342188
0.117354
1.928191
false
false
2025-02-25
2025-02-25
1
formulae/mita-elite-sce-gen1.1-v1-7b-2-26-2025-exp (Merge)
formulae_mita-elite-v1.1-7b-2-25-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-elite-v1.1-7b-2-25-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-elite-v1.1-7b-2-25-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-elite-v1.1-7b-2-25-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-elite-v1.1-7b-2-25-2025
8a144b6296deed37005fd5e8520e76c5ef740d25
2.766025
0
7.616
false
false
false
true
0.697261
0.124973
12.497285
0.286737
1.25321
0
0
0.248322
0
0.348729
1.757812
0.109791
1.08784
false
false
2025-02-25
2025-02-25
1
formulae/mita-elite-v1.1-7b-2-25-2025 (Merge)
formulae_mita-elite-v1.1-gen2-7b-2-25-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-elite-v1.1-gen2-7b-2-25-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-elite-v1.1-gen2-7b-2-25-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-elite-v1.1-gen2-7b-2-25-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-elite-v1.1-gen2-7b-2-25-2025
6d7680a5ae3a6ac96464856180671e95051a814e
3.17751
0
7.616
false
false
false
true
0.684578
0.141085
14.108454
0.292375
1.901203
0
0
0.252517
0.33557
0.354094
1.595052
0.110123
1.124778
false
false
2025-02-25
2025-02-25
1
formulae/mita-elite-v1.1-gen2-7b-2-25-2025 (Merge)
formulae_mita-elite-v1.2-7b-2-26-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-elite-v1.2-7b-2-26-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-elite-v1.2-7b-2-26-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-elite-v1.2-7b-2-26-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-elite-v1.2-7b-2-26-2025
813303bd136b6fbdf7282470e131a68992d3b7f8
5.726217
0
7.616
false
false
false
true
0.687811
0.148004
14.800396
0.293005
1.736426
0.002266
0.226586
0.274329
3.243848
0.428667
12.283333
0.1186
2.066711
false
false
2025-02-25
2025-02-25
1
formulae/mita-elite-v1.2-7b-2-26-2025 (Merge)
formulae_mita-gen3-7b-2-26-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-gen3-7b-2-26-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-gen3-7b-2-26-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-gen3-7b-2-26-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-gen3-7b-2-26-2025
ce4eeafb6839c3230f892b9f3328f1a5b1dcfe2d
5.370051
0
7.616
false
false
false
true
0.689051
0.196414
19.64144
0.291571
2.497037
0.002266
0.226586
0.265101
2.013423
0.391208
6.467708
0.112367
1.374113
false
false
2025-02-25
2025-02-25
1
formulae/mita-gen3-7b-2-26-2025 (Merge)
formulae_mita-gen3-v1.2-7b-2-26-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-gen3-v1.2-7b-2-26-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-gen3-v1.2-7b-2-26-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-gen3-v1.2-7b-2-26-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-gen3-v1.2-7b-2-26-2025
2ca9b92a98b1bc0873bb48f97e488955e5d453b1
5.484287
0
7.616
false
false
false
true
0.689379
0.204358
20.435777
0.305775
2.97598
0.002266
0.226586
0.259228
1.230425
0.39
6.616667
0.112783
1.420287
false
false
2025-02-25
2025-02-25
1
formulae/mita-gen3-v1.2-7b-2-26-2025 (Merge)
formulae_mita-math-v2.3-2-25-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-math-v2.3-2-25-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-math-v2.3-2-25-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-math-v2.3-2-25-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-math-v2.3-2-25-2025
ce5524268d5d97c766f1ed8675a5ae3b6137f57f
3.592858
0
7.616
false
false
false
true
0.678072
0.137338
13.733782
0.29494
2.483287
0
0
0.250839
0.111857
0.36975
3.91875
0.111785
1.309471
false
false
2025-02-25
2025-02-25
1
formulae/mita-math-v2.3-2-25-2025 (Merge)
formulae_mita-v1-7b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-v1-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-v1-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-v1-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-v1-7b
bc7f2503c0f365135dde256aaab0f3034b5141a4
5.85262
1
7.616
false
false
false
false
0.677292
0.197239
19.723888
0.300322
2.731524
0.002266
0.226586
0.25
0
0.415208
10.801042
0.114694
1.632683
false
false
2025-02-24
2025-02-24
1
formulae/mita-v1-7b (Merge)
formulae_mita-v1.1-7b-2-24-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-v1.1-7b-2-24-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-v1.1-7b-2-24-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-v1.1-7b-2-24-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-v1.1-7b-2-24-2025
14d93a482020ed18bd43a886d250d5abc0eebff6
29.482611
0
7.616
false
false
false
false
0.666019
0.34122
34.122018
0.544243
35.4409
0.435045
43.504532
0.314597
8.612975
0.455698
16.06224
0.452377
39.152999
false
false
2025-02-24
2025-02-24
1
formulae/mita-v1.1-7b-2-24-2025 (Merge)
formulae_mita-v1.2-7b-2-24-2025_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/formulae/mita-v1.2-7b-2-24-2025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">formulae/mita-v1.2-7b-2-24-2025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/formulae__mita-v1.2-7b-2-24-2025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
formulae/mita-v1.2-7b-2-24-2025
324ed6a142141697773e3d627365a6c297d6ae58
24.86325
0
7.616
false
false
false
false
0.637931
0.256415
25.64152
0.491946
28.413177
0.487915
48.791541
0.306208
7.494407
0.434396
12.632813
0.335854
26.206043
false
false
2025-02-24
2025-02-24
1
formulae/mita-v1.2-7b-2-24-2025 (Merge)
frameai_Loxa-4B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/frameai/Loxa-4B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frameai/Loxa-4B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/frameai__Loxa-4B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frameai/Loxa-4B
502156f3d50b94d48a7a7c0569c8c0b7492ff02a
17.545787
0
4.018
false
false
false
false
1.544927
0.476484
47.648351
0.421714
18.307903
0.109517
10.951662
0.283557
4.474273
0.337656
3.873698
0.28017
20.018839
false
false
2025-01-13
2025-01-14
0
frameai/Loxa-4B
freewheelin_free-evo-qwen72b-v0.8-re_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-evo-qwen72b-v0.8-re" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-evo-qwen72b-v0.8-re</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-evo-qwen72b-v0.8-re-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-evo-qwen72b-v0.8-re
24e301d8fbef8ada12be42156b01c827ff594962
32.474931
mit
4
72.288
true
false
false
false
23.579581
0.533087
53.308665
0.612748
45.317403
0.180514
18.05136
0.356544
14.205817
0.487167
20.9625
0.487035
43.003842
false
false
2024-05-02
2024-09-15
0
freewheelin/free-evo-qwen72b-v0.8-re
freewheelin_free-solar-evo-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.1
233efd607ae0abbd7b46eded2ee7889892b7bdbb
16.421452
mit
1
10.732
true
false
false
true
1.602222
0.205007
20.500716
0.450221
22.635183
0.008308
0.830816
0.291107
5.480984
0.494583
22.25625
0.341423
26.824764
false
false
2024-04-18
2024-08-07
0
freewheelin/free-solar-evo-v0.1
freewheelin_free-solar-evo-v0.11_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.11</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.11-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.11
17fc24a557bd3c3836abc9f6a367c803cba0cccd
16.779763
mit
0
10.732
true
false
false
true
1.627004
0.202659
20.265894
0.454516
23.182425
0.008308
0.830816
0.285235
4.697987
0.505219
24.285677
0.346742
27.41578
false
false
2024-04-24
2024-08-07
0
freewheelin/free-solar-evo-v0.11
freewheelin_free-solar-evo-v0.13_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.13" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.13</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.13-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.13
2a7eb72f84c54898630f9db470eee0f936a64396
17.405901
mit
1
10.732
true
false
false
true
1.631912
0.23206
23.205982
0.455484
23.354204
0.012085
1.208459
0.288591
5.145414
0.505156
24.077865
0.346991
27.443484
false
false
2024-04-28
2024-08-07
0
freewheelin/free-solar-evo-v0.13
fulim_FineLlama-3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fulim/FineLlama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fulim/FineLlama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fulim__FineLlama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fulim/FineLlama-3.1-8B
a0e5599180fe810c2be5310196d07d1619b6f8e7
13.250844
apache-2.0
0
8
true
false
false
false
1.59188
0.143883
14.388268
0.456921
22.462597
0.047583
4.758308
0.292785
5.704698
0.38674
8.109115
0.316739
24.082077
false
false
2024-12-15
2024-12-15
2
meta-llama/Meta-Llama-3.1-8B
gabrielmbmb_SmolLM-1.7B-Instruct-IFEval_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gabrielmbmb/SmolLM-1.7B-Instruct-IFEval" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gabrielmbmb/SmolLM-1.7B-Instruct-IFEval</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gabrielmbmb__SmolLM-1.7B-Instruct-IFEval-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gabrielmbmb/SmolLM-1.7B-Instruct-IFEval
ac5d711adc05ccfe1b1b912d5561d98f6afeeb40
5.399069
0
1.711
false
false
false
true
0.269491
0.230586
23.058596
0.313843
4.501675
0.010574
1.057402
0.253356
0.447427
0.33276
1.595052
0.115608
1.734264
false
false
2024-10-01
2024-10-11
2
HuggingFaceTB/SmolLM-1.7B
gaverfraxz_Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gaverfraxz__Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA
6b0271a98b8875a65972ed54b0d636d8236ea60b
12.108404
llama3.1
0
8.03
true
false
false
false
2.691348
0.400946
40.094616
0.398484
15.276579
0.019637
1.963746
0.284396
4.58613
0.365042
3.463542
0.165392
7.26581
true
false
2024-09-22
2024-09-23
1
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA (Merge)
gaverfraxz_Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gaverfraxz__Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES
80569e49b5aba960a5cd91281dd9eef92aeff9a3
20.999042
llama3.1
1
8.03
true
false
false
true
1.922714
0.455051
45.505149
0.504366
28.914235
0.129909
12.990937
0.266779
2.237136
0.37375
6.585417
0.367852
29.761377
true
false
2024-09-19
2024-09-19
1
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES (Merge)
gbueno86_Brinebreath-Llama-3.1-70B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gbueno86/Brinebreath-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gbueno86/Brinebreath-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gbueno86__Brinebreath-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gbueno86/Brinebreath-Llama-3.1-70B
c508ecf356167e8c498c6fa3937ba30a82208983
36.254992
llama3.1
4
70.554
true
false
false
true
21.119509
0.553295
55.329526
0.688056
55.463618
0.297583
29.758308
0.346477
12.863535
0.454063
17.491146
0.519614
46.623818
true
false
2024-08-23
2024-08-29
1
gbueno86/Brinebreath-Llama-3.1-70B (Merge)
gbueno86_Meta-LLama-3-Cat-Smaug-LLama-70b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gbueno86__Meta-LLama-3-Cat-Smaug-LLama-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b
2d73b7e1c7157df482555944d6a6b1362bc6c3c5
38.696133
llama3
1
70.554
true
false
false
true
21.804586
0.807185
80.718494
0.667431
51.508386
0.293807
29.380665
0.327181
10.290828
0.436823
15.002865
0.50748
45.275561
true
false
2024-05-24
2024-06-27
1
gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b (Merge)
ghost-x_ghost-8b-beta-1608_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ghost-x/ghost-8b-beta-1608" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ghost-x/ghost-8b-beta-1608</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ghost-x__ghost-8b-beta-1608-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ghost-x/ghost-8b-beta-1608
6d1b3853aab774af5a4db21ff9d5764918fb48f5
16.047244
other
31
8.03
true
false
false
true
1.697862
0.427274
42.727408
0.451655
23.463964
0.069486
6.94864
0.258389
1.118568
0.351583
1.58125
0.283993
20.443632
false
false
2024-08-18
2024-09-17
1
ghost-x/ghost-8b-beta
glaiveai_Reflection-Llama-3.1-70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/glaiveai/Reflection-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">glaiveai/Reflection-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/glaiveai__Reflection-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
glaiveai/Reflection-Llama-3.1-70B
086bd2658e00345808b31758ebb8f7e2c6d9897c
34.519478
0
69.5
false
false
false
true
50.487553
0.599057
59.905717
0.568101
37.960486
0.27568
27.567976
0.314597
8.612975
0.438031
13.720573
0.634142
59.349143
false
false
2024-10-07
0
Removed
gmonsoon_SahabatAI-Llama-11B-Test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/SahabatAI-Llama-11B-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/SahabatAI-Llama-11B-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__SahabatAI-Llama-11B-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/SahabatAI-Llama-11B-Test
f6340b95d6e6cf766b6de29d36ee0db373ef175b
16.265619
llama3
0
11.52
true
false
false
false
2.099944
0.337573
33.757319
0.472758
24.457264
0.030967
3.096677
0.281879
4.250559
0.400135
7.783594
0.318235
24.248301
false
false
2024-11-22
2024-11-23
1
gmonsoon/SahabatAI-Llama-11B-Test (Merge)
gmonsoon_SahabatAI-MediChatIndo-8B-v1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/SahabatAI-MediChatIndo-8B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/SahabatAI-MediChatIndo-8B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__SahabatAI-MediChatIndo-8B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/SahabatAI-MediChatIndo-8B-v1
2f7daa8eb5ad216ce9ebcd70dc77e5b44fb977b0
17.299865
llama3
0
8.03
true
false
false
true
1.353444
0.416283
41.628324
0.450883
23.6401
0.061934
6.193353
0.282718
4.362416
0.375396
4.557813
0.310755
23.417184
true
false
2024-11-19
2024-11-19
1
gmonsoon/SahabatAI-MediChatIndo-8B-v1 (Merge)
gmonsoon_SahabatAI-Rebase-8B-Test_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/SahabatAI-Rebase-8B-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/SahabatAI-Rebase-8B-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__SahabatAI-Rebase-8B-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/SahabatAI-Rebase-8B-Test
aef1b4c94595f3ef110d3d69724828a2fb416b5d
23.516791
0
8.03
false
false
false
true
1.30431
0.515626
51.562632
0.522961
32.002221
0.114804
11.480363
0.287752
5.033557
0.413281
11.426823
0.366356
29.595154
false
false
2024-11-21
2024-11-23
1
gmonsoon/SahabatAI-Rebase-8B-Test (Merge)
gmonsoon_StockSeaLLMs-7B-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/StockSeaLLMs-7B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/StockSeaLLMs-7B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__StockSeaLLMs-7B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/StockSeaLLMs-7B-v1
2431fe5e4a3f63984c2936cf1cf68b3c7172cc20
25.093451
0
7.616
false
false
false
true
1.393781
0.459922
45.99219
0.527109
34.012625
0.196375
19.637462
0.302852
7.04698
0.421375
11.071875
0.395196
32.799572
false
false
2024-11-20
2024-11-20
1
gmonsoon/StockSeaLLMs-7B-v1 (Merge)
gmonsoon_gemma2-9b-sahabatai-v1-instruct-BaseTIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__gemma2-9b-sahabatai-v1-instruct-BaseTIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES
43296081051afe5d7a426b86a6d73104efab440b
33.804569
gemma
2
9.242
true
false
false
true
3.562969
0.737792
73.779239
0.607724
43.401342
0.199396
19.939577
0.32047
9.395973
0.477802
19.12526
0.434674
37.186022
true
false
2024-11-16
2024-11-17
1
gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES (Merge)
godlikehhd_alpaca_data_full_2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_full_2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_full_2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_full_2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_full_2
703dbf51c1ae5721f1313d1af82bd18ed38c4910
16.073237
apache-2.0
0
1.544
true
false
false
false
1.473495
0.317815
31.781451
0.421695
18.44695
0.0929
9.29003
0.297819
6.375839
0.405156
9.944531
0.285406
20.600621
false
false
2025-01-14
2025-01-14
0
godlikehhd/alpaca_data_full_2
godlikehhd_alpaca_data_full_3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_full_3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_full_3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_full_3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_full_3B
8a49b5791eede0b460b64ff5bf62b26b10c698d9
21.162549
apache-2.0
0
3.086
true
false
false
false
1.698399
0.369572
36.957163
0.468419
25.169137
0.133686
13.36858
0.277685
3.691275
0.495479
21.601563
0.335688
26.187574
false
false
2025-01-15
2025-01-16
0
godlikehhd/alpaca_data_full_3B
godlikehhd_alpaca_data_ifd_max_2600_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ifd_max_2600" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ifd_max_2600</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ifd_max_2600-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ifd_max_2600
91d51354719b002d8dcaa61ab6e2f30c8a6778ca
15.169551
apache-2.0
0
1.544
true
false
false
false
1.933345
0.30425
30.42505
0.402851
15.966393
0.098943
9.89426
0.302852
7.04698
0.350865
6.391406
0.291639
21.293218
false
false
2025-01-14
2025-01-14
0
godlikehhd/alpaca_data_ifd_max_2600
godlikehhd_alpaca_data_ifd_max_2600_3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ifd_max_2600_3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ifd_max_2600_3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ifd_max_2600_3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ifd_max_2600_3B
315f9cdb7fbbaa343c45493e5153a6b5d5ed34e4
18.603035
apache-2.0
0
3.086
true
false
false
false
2.353622
0.298156
29.815556
0.462638
24.472519
0.159366
15.936556
0.272651
3.020134
0.434552
12.952344
0.32879
25.421099
false
false
2025-01-15
2025-01-16
0
godlikehhd/alpaca_data_ifd_max_2600_3B
godlikehhd_alpaca_data_ifd_me_max_5200_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ifd_me_max_5200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ifd_me_max_5200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ifd_me_max_5200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ifd_me_max_5200
8ee0cd7292a326a10be50321e5a58c0e785f04c7
16.250398
0
1.544
false
false
false
false
1.754303
0.368323
36.832272
0.415345
16.823957
0.097432
9.743202
0.291107
5.480984
0.34826
6.599219
0.298205
22.022754
false
false
2025-01-09
0
Removed
godlikehhd_alpaca_data_ifd_min_2600_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ifd_min_2600" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ifd_min_2600</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ifd_min_2600-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ifd_min_2600
67ea00ab824c427676699f28900d2851eee2f05e
16.517773
apache-2.0
0
1.544
true
false
false
false
1.767629
0.374967
37.496731
0.421905
18.611621
0.096677
9.667674
0.291946
5.592841
0.365625
6.703125
0.289312
21.034648
false
false
2025-01-15
2025-01-15
0
godlikehhd/alpaca_data_ifd_min_2600
godlikehhd_alpaca_data_ins_ans_max_5200_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ins_ans_max_5200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ins_ans_max_5200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ins_ans_max_5200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ins_ans_max_5200
fe8144d1b38ee217159f292c94ca77faf8379400
15.924441
0
1.544
false
false
false
false
1.771621
0.347865
34.786478
0.409821
16.268689
0.102719
10.271903
0.291107
5.480984
0.360167
7.620833
0.29006
21.11776
false
false
2025-01-09
0
Removed
godlikehhd_alpaca_data_ins_max_5200_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ins_max_5200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ins_max_5200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ins_max_5200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ins_max_5200
e309c8a92204e30723f71f917682cd463cf45290
15.683739
apache-2.0
0
1.544
true
false
false
false
2.112828
0.327507
32.750657
0.415507
17.540672
0.099698
9.969789
0.296141
6.152125
0.361375
6.405208
0.291556
21.283983
false
false
2025-01-09
2025-01-09
0
godlikehhd/alpaca_data_ins_max_5200
godlikehhd_alpaca_data_ins_min_2600_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ins_min_2600" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ins_min_2600</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ins_min_2600-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ins_min_2600
98f1ca459872ef11a414a2e9f40fe1a9c331c67f
16.228325
apache-2.0
0
1.544
true
false
false
false
2.120539
0.333002
33.300199
0.418735
17.536332
0.111027
11.102719
0.297819
6.375839
0.385344
8.167969
0.287982
20.886894
false
false
2025-01-14
2025-01-14
0
godlikehhd/alpaca_data_ins_min_2600
godlikehhd_alpaca_data_ins_min_5200_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_ins_min_5200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_ins_min_5200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_ins_min_5200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_ins_min_5200
1039859027710fdd5196d401422d54c55d494ac4
16.383236
apache-2.0
0
1.544
true
false
false
false
1.998622
0.336
33.599959
0.428928
18.691278
0.103474
10.347432
0.286913
4.9217
0.390552
9.085677
0.29488
21.653369
false
false
2025-01-09
2025-01-09
0
godlikehhd/alpaca_data_ins_min_5200
godlikehhd_alpaca_data_sampled_ifd_5200_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_sampled_ifd_5200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_sampled_ifd_5200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_sampled_ifd_5200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_sampled_ifd_5200
744dc68050671f42b110bc322f0ef4d71549f067
15.703649
apache-2.0
0
1.544
true
false
false
false
1.724911
0.292385
29.238532
0.403297
15.968254
0.125378
12.537764
0.308725
7.829978
0.352073
7.575781
0.289644
21.071587
false
false
2025-01-09
2025-01-09
0
godlikehhd/alpaca_data_sampled_ifd_5200
godlikehhd_alpaca_data_sampled_ifd_new_5200_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_sampled_ifd_new_5200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_sampled_ifd_new_5200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_sampled_ifd_new_5200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_sampled_ifd_new_5200
cc169308e13b6a4f030713d9eda4ec005ebf1e16
16.537781
apache-2.0
0
1.544
true
false
false
false
1.755226
0.366325
36.632469
0.417783
17.561425
0.094411
9.441088
0.293624
5.816555
0.36125
8.389583
0.29247
21.385564
false
false
2025-01-09
2025-01-09
0
godlikehhd/alpaca_data_sampled_ifd_new_5200
godlikehhd_alpaca_data_score_max_0.1_2600_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_score_max_0.1_2600" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_score_max_0.1_2600</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_score_max_0.1_2600-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_score_max_0.1_2600
04344047146d1d4f6da4ed9668717a84daf6718c
15.917241
apache-2.0
0
1.544
true
false
false
false
2.044421
0.328755
32.875548
0.425226
18.621497
0.098943
9.89426
0.291107
5.480984
0.370646
7.264063
0.292304
21.367095
false
false
2025-01-14
2025-01-14
0
godlikehhd/alpaca_data_score_max_0.1_2600
godlikehhd_alpaca_data_score_max_0.3_2600_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_score_max_0.3_2600" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_score_max_0.3_2600</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_score_max_0.3_2600-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_score_max_0.3_2600
382a503c65ab28e85ed07a4f17f8efae0b3fac4b
15.960793
apache-2.0
0
1.544
true
false
false
false
2.066964
0.337523
33.752333
0.415145
16.924621
0.103474
10.347432
0.28943
5.257271
0.375948
8.226823
0.291307
21.25628
false
false
2025-01-14
2025-01-15
0
godlikehhd/alpaca_data_score_max_0.3_2600
godlikehhd_alpaca_data_score_max_0.7_2600_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_score_max_0.7_2600" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_score_max_0.7_2600</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_score_max_0.7_2600-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_score_max_0.7_2600
68c728f78b338c7d40f565524d4e8bbd0afd2f3d
16.630379
apache-2.0
0
1.544
true
false
false
false
1.723766
0.363976
36.397647
0.418453
18.143986
0.107251
10.725076
0.303691
7.158837
0.346865
5.32474
0.298288
22.031989
false
false
2025-01-14
2025-01-14
0
godlikehhd/alpaca_data_score_max_0.7_2600
godlikehhd_alpaca_data_score_max_2500_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_score_max_2500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_score_max_2500</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_score_max_2500-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_score_max_2500
0d1adc4306c55ac0d89f12088768a170155ae293
16.491152
apache-2.0
0
1.544
true
false
false
false
2.116895
0.356358
35.63578
0.418014
17.930588
0.095166
9.516616
0.295302
6.040268
0.362708
8.271875
0.293966
21.551788
false
false
2025-01-13
2025-01-13
0
godlikehhd/alpaca_data_score_max_2500
godlikehhd_alpaca_data_score_max_2600_3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_score_max_2600_3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_score_max_2600_3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_score_max_2600_3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_score_max_2600_3B
45fa74597fa81030ef31d9be0776cb93f2a0e1c6
19.567091
apache-2.0
0
3.086
true
false
false
false
2.060019
0.335775
33.577463
0.471631
26.009267
0.154834
15.483384
0.265101
2.013423
0.447448
14.297656
0.334192
26.02135
false
false
2025-01-15
2025-01-15
0
godlikehhd/alpaca_data_score_max_2600_3B
godlikehhd_alpaca_data_score_max_5200_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/godlikehhd/alpaca_data_score_max_5200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">godlikehhd/alpaca_data_score_max_5200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/godlikehhd__alpaca_data_score_max_5200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
godlikehhd/alpaca_data_score_max_5200
71f132308c6e915db3be25cb52c8364baee03246
16.366038
apache-2.0
0
1.544
true
false
false
false
2.11706
0.344542
34.454248
0.424171
18.575115
0.097432
9.743202
0.297819
6.375839
0.387792
7.440625
0.294465
21.607196
false
false
2025-01-10
2025-01-10
0
godlikehhd/alpaca_data_score_max_5200