Column schema and value-length statistics (min/max per column; list columns report item counts):

| column | value type | min | max |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
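A minimal sketch of how a schema like the one above can be inspected with the Hugging Face `datasets` library. The repository ID below is a placeholder, since this dump does not name the parent dataset holding these rows; treat it as a hypothetical example rather than a working reference.

```python
from datasets import load_dataset

# Hypothetical repository ID: the dump above does not name the parent dataset.
ds = load_dataset("your-org/dataset-card-dump", split="train")

# The feature schema should match the columns listed above
# (sha, text, id, tags, created_at, metadata, last_modified, ...).
print(ds.features)

# Length statistics such as "sha: 40/40" can be recomputed from the rows themselves.
sha_lengths = [len(row["sha"]) for row in ds]
print(min(sha_lengths), max(sha_lengths))
```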
sha: ec4fb2504a4f44ea209a13a048c469007a2bb368
text: # Dataset Card for "llama2-healthcare-guanaco" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
id: JayChauhan99/llama2-healthcare-guanaco
tags: [ "region:us" ]
created_at: 2023-12-08T02:34:08+00:00
metadata: {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 469682, "num_examples": 340}], "download_size": 255083, "dataset_size": 469682}}
last_modified: 2023-12-08T05:38:17+00:00
arxiv: []
languages: []
tags_str: TAGS #region-us
text_str: # Dataset Card for "llama2-healthcare-guanaco" More Information needed
text_lists: [ "# Dataset Card for \"llama2-healthcare-guanaco\"\n\nMore Information needed" ]
processed_texts: [ "TAGS\n#region-us \n", "# Dataset Card for \"llama2-healthcare-guanaco\"\n\nMore Information needed" ]
tokens_length: [ 6, 19 ]
input_texts: [ "passage: TAGS\n#region-us \n# Dataset Card for \"llama2-healthcare-guanaco\"\n\nMore Information needed" ]
sha: 613624fabe364f85377118d6723bced3b72724da
text: Use the Edit dataset card button to edit.
id: laion/meta-imagine-dataset
tags: [ "region:us" ]
created_at: 2023-12-08T02:37:03+00:00
metadata: {"dataset_info": {"features": [{"name": "caption", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "link", "dtype": "string"}, {"name": "message_id", "dtype": "string"}, {"name": "timestamp", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 0, "num_examples": 0}], "download_size": 0, "dataset_size": 0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
last_modified: 2023-12-08T03:14:11+00:00
arxiv: []
languages: []
tags_str: TAGS #region-us
text_str: Use the Edit dataset card button to edit.
text_lists: []
processed_texts: [ "TAGS\n#region-us \n" ]
tokens_length: [ 6 ]
input_texts: [ "passage: TAGS\n#region-us \n" ]
sha: 78fd9eb4d72b120647d8e1aff3ad48701883a794
# Dataset Card for Evaluation run of athirdpath/Iambe-20b-DARE-v2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/athirdpath/Iambe-20b-DARE-v2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [athirdpath/Iambe-20b-DARE-v2](https://huggingface.co/athirdpath/Iambe-20b-DARE-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T02:48:17.586217](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2/blob/main/results_2023-12-08T02-48-17.586217.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6035804809886537, "acc_stderr": 0.03294194113186395, "acc_norm": 0.608982387572558, "acc_norm_stderr": 0.0336160701060513, "mc1": 0.390452876376989, "mc1_stderr": 0.01707823074343145, "mc2": 0.5385363923413744, "mc2_stderr": 0.01567101081137168 }, "harness|arc:challenge|25": { "acc": 0.6023890784982935, "acc_stderr": 0.014301752223279538, "acc_norm": 0.6279863481228669, "acc_norm_stderr": 0.014124597881844456 }, "harness|hellaswag|10": { "acc": 0.6562437761402111, "acc_stderr": 0.004739902411944536, "acc_norm": 0.8453495319657439, "acc_norm_stderr": 0.0036083220651418873 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.618421052631579, "acc_stderr": 0.03953173377749194, "acc_norm": 0.618421052631579, "acc_norm_stderr": 0.03953173377749194 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.029647813539365245, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.029647813539365245 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 
0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5317919075144508, "acc_stderr": 0.038047497443647646, "acc_norm": 0.5317919075144508, "acc_norm_stderr": 0.038047497443647646 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077615, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077615 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.032683358999363366, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.04339138322579861, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.04339138322579861 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3412698412698413, "acc_stderr": 0.024419234966819067, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.024419234966819067 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7161290322580646, "acc_stderr": 0.02564938106302926, "acc_norm": 0.7161290322580646, "acc_norm_stderr": 0.02564938106302926 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7212121212121212, "acc_stderr": 0.035014387062967806, "acc_norm": 0.7212121212121212, "acc_norm_stderr": 0.035014387062967806 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7424242424242424, "acc_stderr": 0.031156269519646836, "acc_norm": 0.7424242424242424, "acc_norm_stderr": 0.031156269519646836 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.02541634309630644, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.02541634309630644 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6128205128205129, "acc_stderr": 0.02469721693087894, "acc_norm": 0.6128205128205129, "acc_norm_stderr": 0.02469721693087894 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473072, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473072 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 
0.031357095996135904, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.031357095996135904 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 0.039955240076816806, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.039955240076816806 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7761467889908257, "acc_stderr": 0.017871217767790222, "acc_norm": 0.7761467889908257, "acc_norm_stderr": 0.017871217767790222 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389087, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389087 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.040103589424622034, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.04464285714285714, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285714 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326468, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326468 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7905491698595147, "acc_stderr": 0.014551310568143705, "acc_norm": 0.7905491698595147, "acc_norm_stderr": 0.014551310568143705 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.684971098265896, "acc_stderr": 0.025009313790069716, "acc_norm": 0.684971098265896, "acc_norm_stderr": 0.025009313790069716 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.48268156424581005, "acc_stderr": 0.016712467441702523, "acc_norm": 0.48268156424581005, "acc_norm_stderr": 0.016712467441702523 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.027057974624494382, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6881028938906752, "acc_stderr": 0.026311858071854155, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.026311858071854155 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.7067901234567902, "acc_stderr": 0.02532988817190092, "acc_norm": 0.7067901234567902, "acc_norm_stderr": 0.02532988817190092 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657476, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657476 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5808823529411765, "acc_stderr": 0.029972807170464626, "acc_norm": 0.5808823529411765, "acc_norm_stderr": 0.029972807170464626 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6160130718954249, "acc_stderr": 0.019675808135281515, "acc_norm": 0.6160130718954249, "acc_norm_stderr": 0.019675808135281515 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6816326530612244, "acc_stderr": 0.029822533793982062, "acc_norm": 0.6816326530612244, "acc_norm_stderr": 0.029822533793982062 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786848, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786848 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.03015113445777634, "acc_norm": 0.9, "acc_norm_stderr": 0.03015113445777634 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.032180937956023566, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.032180937956023566 }, "harness|truthfulqa:mc|0": { "mc1": 0.390452876376989, "mc1_stderr": 0.01707823074343145, "mc2": 0.5385363923413744, "mc2_stderr": 0.01567101081137168 }, "harness|winogrande|5": { "acc": 0.7703235990528808, "acc_stderr": 0.011821645601838229 }, "harness|gsm8k|5": { "acc": 0.332827899924185, "acc_stderr": 0.012979892496598268 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
id: open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2
tags: [ "region:us" ]
created_at: 2023-12-08T02:51:12+00:00
{"pretty_name": "Evaluation run of athirdpath/Iambe-20b-DARE-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [athirdpath/Iambe-20b-DARE-v2](https://huggingface.co/athirdpath/Iambe-20b-DARE-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T02:48:17.586217](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2/blob/main/results_2023-12-08T02-48-17.586217.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6035804809886537,\n \"acc_stderr\": 0.03294194113186395,\n \"acc_norm\": 0.608982387572558,\n \"acc_norm_stderr\": 0.0336160701060513,\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.01707823074343145,\n \"mc2\": 0.5385363923413744,\n \"mc2_stderr\": 0.01567101081137168\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279538,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844456\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6562437761402111,\n \"acc_stderr\": 0.004739902411944536,\n \"acc_norm\": 0.8453495319657439,\n \"acc_norm_stderr\": 0.0036083220651418873\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n 
\"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.038047497443647646,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.038047497443647646\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n \"acc_stderr\": 0.02564938106302926,\n \"acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.02564938106302926\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790222,\n \"acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790222\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389087,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389087\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n \"acc_stderr\": 0.014551310568143705,\n 
\"acc_norm\": 0.7905491698595147,\n \"acc_norm_stderr\": 0.014551310568143705\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n \"acc_stderr\": 0.016712467441702523,\n \"acc_norm\": 0.48268156424581005,\n \"acc_norm_stderr\": 0.016712467441702523\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464626,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464626\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281515,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.01707823074343145,\n \"mc2\": 0.5385363923413744,\n \"mc2_stderr\": 0.01567101081137168\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.332827899924185,\n \"acc_stderr\": 0.012979892496598268\n }\n}\n```", "repo_url": "https://huggingface.co/athirdpath/Iambe-20b-DARE-v2", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|arc:challenge|25_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|gsm8k|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hellaswag|10_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["**/details_harness|winogrande|5_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T02-48-17.586217.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T02_48_17.586217", "path": ["results_2023-12-08T02-48-17.586217.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T02-48-17.586217.parquet"]}]}]}
2023-12-08T02:51:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of athirdpath/Iambe-20b-DARE-v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model athirdpath/Iambe-20b-DARE-v2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-08T02:48:17.586217 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
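A minimal sketch of the loading call referenced above: the repository id is an assumption, inferred from the `open-llm-leaderboard/details_<org>__<model>` naming used by other records in this dump, while the config and split names are taken from this record's metadata field.

```python
# Hedged sketch: the repository id below is assumed, not stated in this record.
from datasets import load_dataset

details = load_dataset(
    "open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2",  # assumed id
    "harness_winogrande_5",  # one of the configs listed in this record's metadata
    split="latest",          # or the timestamped split "2023_12_08T02_48_17.586217"
)
print(details)
```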
[ "# Dataset Card for Evaluation run of athirdpath/Iambe-20b-DARE-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model athirdpath/Iambe-20b-DARE-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T02:48:17.586217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of athirdpath/Iambe-20b-DARE-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model athirdpath/Iambe-20b-DARE-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T02:48:17.586217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of athirdpath/Iambe-20b-DARE-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model athirdpath/Iambe-20b-DARE-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T02:48:17.586217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a509b751f7866515e1abcddd8bf6aabf59e0ffb9
# Dataset Card for Evaluation run of FPHam/Sydney_Overthinker_13b_HF ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/FPHam/Sydney_Overthinker_13b_HF - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [FPHam/Sydney_Overthinker_13b_HF](https://huggingface.co/FPHam/Sydney_Overthinker_13b_HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_FPHam__Sydney_Overthinker_13b_HF", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T02:51:52.068469](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Sydney_Overthinker_13b_HF/blob/main/results_2023-12-08T02-51-52.068469.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5136287950246363, "acc_stderr": 0.034087739996992605, "acc_norm": 0.5191163704808761, "acc_norm_stderr": 0.03483348555557295, "mc1": 0.2998776009791922, "mc1_stderr": 0.01604035296671362, "mc2": 0.45697851910783077, "mc2_stderr": 0.015427158150833389 }, "harness|arc:challenge|25": { "acc": 0.5614334470989761, "acc_stderr": 0.014500682618212865, "acc_norm": 0.5895904436860068, "acc_norm_stderr": 0.014374922192642664 }, "harness|hellaswag|10": { "acc": 0.6118303126867158, "acc_stderr": 0.004863375698153865, "acc_norm": 0.8085042820155347, "acc_norm_stderr": 0.0039267405951797715 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04063302731486671, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04063302731486671 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5132075471698113, "acc_stderr": 0.030762134874500476, "acc_norm": 0.5132075471698113, "acc_norm_stderr": 0.030762134874500476 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842425, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842425 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 
0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4797687861271676, "acc_stderr": 0.03809342081273957, "acc_norm": 0.4797687861271676, "acc_norm_stderr": 0.03809342081273957 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.0433643270799318, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.0433643270799318 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4553191489361702, "acc_stderr": 0.03255525359340355, "acc_norm": 0.4553191489361702, "acc_norm_stderr": 0.03255525359340355 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.041424397194893624, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.041424397194893624 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.46206896551724136, "acc_stderr": 0.041546596717075474, "acc_norm": 0.46206896551724136, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.023517294335963286, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.023517294335963286 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.042857142857142816, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.042857142857142816 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5870967741935483, "acc_stderr": 0.028009138125400387, "acc_norm": 0.5870967741935483, "acc_norm_stderr": 0.028009138125400387 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3399014778325123, "acc_stderr": 0.033327690684107895, "acc_norm": 0.3399014778325123, "acc_norm_stderr": 0.033327690684107895 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6181818181818182, "acc_stderr": 0.03793713171165634, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.03793713171165634 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6616161616161617, "acc_stderr": 0.033711241426263014, "acc_norm": 0.6616161616161617, "acc_norm_stderr": 0.033711241426263014 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7409326424870466, "acc_stderr": 0.0316187791793541, "acc_norm": 0.7409326424870466, "acc_norm_stderr": 0.0316187791793541 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5230769230769231, "acc_stderr": 0.025323990861736232, "acc_norm": 0.5230769230769231, "acc_norm_stderr": 0.025323990861736232 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.46218487394957986, "acc_stderr": 0.0323854694875898, "acc_norm": 0.46218487394957986, "acc_norm_stderr": 0.0323854694875898 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.25165562913907286, "acc_stderr": 0.035433042343899844, "acc_norm": 0.25165562913907286, "acc_norm_stderr": 0.035433042343899844 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6862385321100918, "acc_stderr": 0.019894723341469116, "acc_norm": 0.6862385321100918, "acc_norm_stderr": 0.019894723341469116 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.33796296296296297, "acc_stderr": 0.03225941352631295, "acc_norm": 0.33796296296296297, "acc_norm_stderr": 0.03225941352631295 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6323529411764706, "acc_stderr": 0.03384132045674118, "acc_norm": 0.6323529411764706, "acc_norm_stderr": 0.03384132045674118 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6708860759493671, "acc_stderr": 0.030587326294702368, "acc_norm": 0.6708860759493671, "acc_norm_stderr": 0.030587326294702368 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6412556053811659, "acc_stderr": 0.03219079200419995, "acc_norm": 0.6412556053811659, "acc_norm_stderr": 0.03219079200419995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7107438016528925, "acc_stderr": 0.04139112727635463, "acc_norm": 0.7107438016528925, "acc_norm_stderr": 0.04139112727635463 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5705521472392638, "acc_stderr": 0.03889066619112722, "acc_norm": 0.5705521472392638, "acc_norm_stderr": 0.03889066619112722 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280041, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280041 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7948717948717948, "acc_stderr": 0.026453508054040335, "acc_norm": 0.7948717948717948, "acc_norm_stderr": 0.026453508054040335 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.56, "acc_stderr": 0.0498887651569859, "acc_norm": 0.56, "acc_norm_stderr": 0.0498887651569859 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7203065134099617, "acc_stderr": 0.016050792148036522, "acc_norm": 0.7203065134099617, "acc_norm_stderr": 0.016050792148036522 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5809248554913294, "acc_stderr": 0.026564178111422622, "acc_norm": 0.5809248554913294, "acc_norm_stderr": 0.026564178111422622 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.32737430167597764, "acc_stderr": 0.015694238967737383, "acc_norm": 0.32737430167597764, "acc_norm_stderr": 0.015694238967737383 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5490196078431373, "acc_stderr": 0.028491993586171566, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.028491993586171566 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6045016077170418, "acc_stderr": 0.02777091853142784, "acc_norm": 0.6045016077170418, "acc_norm_stderr": 0.02777091853142784 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.6080246913580247, "acc_stderr": 0.027163686038271146, "acc_norm": 0.6080246913580247, "acc_norm_stderr": 0.027163686038271146 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.02883892147125146, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.02883892147125146 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3963494132985658, "acc_stderr": 0.012492830452095217, "acc_norm": 0.3963494132985658, "acc_norm_stderr": 0.012492830452095217 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4411764705882353, "acc_stderr": 0.03016191193076711, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.03016191193076711 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5163398692810458, "acc_stderr": 0.02021703065318646, "acc_norm": 0.5163398692810458, "acc_norm_stderr": 0.02021703065318646 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670239, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.04673752333670239 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.031251275910891656, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.031251275910891656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6467661691542289, "acc_stderr": 0.03379790611796777, "acc_norm": 0.6467661691542289, "acc_norm_stderr": 0.03379790611796777 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7134502923976608, "acc_stderr": 0.03467826685703826, "acc_norm": 0.7134502923976608, "acc_norm_stderr": 0.03467826685703826 }, "harness|truthfulqa:mc|0": { "mc1": 0.2998776009791922, "mc1_stderr": 0.01604035296671362, "mc2": 0.45697851910783077, "mc2_stderr": 0.015427158150833389 }, "harness|winogrande|5": { "acc": 0.739542225730071, "acc_stderr": 0.012334833671998297 }, "harness|gsm8k|5": { "acc": 0.18877937831690675, "acc_stderr": 0.010779262837202751 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_FPHam__Sydney_Overthinker_13b_HF
[ "region:us" ]
2023-12-08T02:54:47+00:00
{"pretty_name": "Evaluation run of FPHam/Sydney_Overthinker_13b_HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [FPHam/Sydney_Overthinker_13b_HF](https://huggingface.co/FPHam/Sydney_Overthinker_13b_HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FPHam__Sydney_Overthinker_13b_HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T02:51:52.068469](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Sydney_Overthinker_13b_HF/blob/main/results_2023-12-08T02-51-52.068469.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5136287950246363,\n \"acc_stderr\": 0.034087739996992605,\n \"acc_norm\": 0.5191163704808761,\n \"acc_norm_stderr\": 0.03483348555557295,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.45697851910783077,\n \"mc2_stderr\": 0.015427158150833389\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212865,\n \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642664\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6118303126867158,\n \"acc_stderr\": 0.004863375698153865,\n \"acc_norm\": 0.8085042820155347,\n \"acc_norm_stderr\": 0.0039267405951797715\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500476,\n \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500476\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 
0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.0433643270799318,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.0433643270799318\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5870967741935483,\n \"acc_stderr\": 0.028009138125400387,\n \"acc_norm\": 0.5870967741935483,\n \"acc_norm_stderr\": 0.028009138125400387\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165634,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165634\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6616161616161617,\n \"acc_stderr\": 0.033711241426263014,\n \"acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.033711241426263014\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.0316187791793541,\n \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.0316187791793541\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736232,\n \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736232\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.0323854694875898,\n \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.0323854694875898\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6862385321100918,\n \"acc_stderr\": 0.019894723341469116,\n \"acc_norm\": 0.6862385321100918,\n \"acc_norm_stderr\": 0.019894723341469116\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674118,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674118\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6708860759493671,\n \"acc_stderr\": 0.030587326294702368,\n \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.030587326294702368\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n \"acc_stderr\": 0.026453508054040335,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.026453508054040335\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7203065134099617,\n 
\"acc_stderr\": 0.016050792148036522,\n \"acc_norm\": 0.7203065134099617,\n \"acc_norm_stderr\": 0.016050792148036522\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5809248554913294,\n \"acc_stderr\": 0.026564178111422622,\n \"acc_norm\": 0.5809248554913294,\n \"acc_norm_stderr\": 0.026564178111422622\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n \"acc_stderr\": 0.015694238967737383,\n \"acc_norm\": 0.32737430167597764,\n \"acc_norm_stderr\": 0.015694238967737383\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.028491993586171566,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.028491993586171566\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3963494132985658,\n \"acc_stderr\": 0.012492830452095217,\n \"acc_norm\": 0.3963494132985658,\n \"acc_norm_stderr\": 0.012492830452095217\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03016191193076711,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03016191193076711\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02021703065318646,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02021703065318646\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.45697851910783077,\n \"mc2_stderr\": 0.015427158150833389\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998297\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18877937831690675,\n \"acc_stderr\": 0.010779262837202751\n }\n}\n```", 
"repo_url": "https://huggingface.co/FPHam/Sydney_Overthinker_13b_HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|arc:challenge|25_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|gsm8k|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hellaswag|10_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-51-52.068469.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-51-52.068469.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-51-52.068469.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T02-51-52.068469.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-51-52.068469.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T02_51_52.068469", "path": ["**/details_harness|winogrande|5_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T02-51-52.068469.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_08T02_51_52.068469", "path": ["results_2023-12-08T02-51-52.068469.parquet"]}, {"split": "latest", "path": ["results_2023-12-08T02-51-52.068469.parquet"]}]}]}
2023-12-08T02:55:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FPHam/Sydney_Overthinker_13b_HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model FPHam/Sydney_Overthinker_13b_HF on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-08T02:51:52.068469(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of FPHam/Sydney_Overthinker_13b_HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FPHam/Sydney_Overthinker_13b_HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T02:51:52.068469(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FPHam/Sydney_Overthinker_13b_HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model FPHam/Sydney_Overthinker_13b_HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T02:51:52.068469(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FPHam/Sydney_Overthinker_13b_HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FPHam/Sydney_Overthinker_13b_HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T02:51:52.068469(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1947a5a81ebbd83f2c831b58f308d1f11d6c6954
glove.6B.100d.txt for practice
SLU-CSCI4750/glove.6B.100d.txt
[ "region:us" ]
2023-12-08T03:05:55+00:00
{}
2023-12-08T03:08:37+00:00
[]
[]
TAGS #region-us
URL for practice
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
3b4998d357ce46d13416af9bc4edb11b4682fe2e
This is a test
ManuelAlv/test
[ "region:us" ]
2023-12-08T03:35:51+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1332053, "num_examples": 3700}, {"name": "validation", "num_bytes": 799362, "num_examples": 2220}, {"name": "test", "num_bytes": 538180, "num_examples": 1480}], "download_size": 1766713, "dataset_size": 2669595}}
2023-12-08T05:30:59+00:00
[]
[]
TAGS #region-us
This is a test
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
d27b7b59264a7a4053794fc16efc364cb2c99fc5
# Multimodal Dataset from iGeo (The INTERNATIONAL GEOGRAPHY OLYMPIAD) igeo2017🚧
LMSH7/iGeo
[ "region:us" ]
2023-12-08T03:43:47+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "options", "sequence": "string"}, {"name": "answer", "sequence": "string"}, {"name": "file_1", "dtype": "image"}, {"name": "file_2", "dtype": "image"}, {"name": "file_3", "dtype": "image"}, {"name": "file_4", "dtype": "image"}, {"name": "file_5", "dtype": "image"}, {"name": "file_6", "dtype": "image"}, {"name": "file_7", "dtype": "image"}, {"name": "file_8", "dtype": "null"}, {"name": "file_9", "dtype": "null"}, {"name": "file_10", "dtype": "null"}, {"name": "file_11", "dtype": "null"}, {"name": "description", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 13169454.0, "num_examples": 21}], "download_size": 13181176, "dataset_size": 13169454.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-08T08:48:49+00:00
[]
[]
TAGS #region-us
# Multimodel Dataset from iGeo(The INTERNATIONAL GEOGRAPHY OLYMPIAD) igeo2017
[ "# Multimodel Dataset from iGeo(The INTERNATIONAL GEOGRAPHY OLYMPIAD)\n\nigeo2017" ]
[ "TAGS\n#region-us \n", "# Multimodel Dataset from iGeo(The INTERNATIONAL GEOGRAPHY OLYMPIAD)\n\nigeo2017" ]
[ 6, 25 ]
[ "passage: TAGS\n#region-us \n# Multimodel Dataset from iGeo(The INTERNATIONAL GEOGRAPHY OLYMPIAD)\n\nigeo2017" ]
c336a8ec8e44529b7bdfd2ad8e5ea0b434c3b07f
## Dataset Card for Dataset Name The dataset is designed to analyze and address hate speech within online platforms. It consists of two sets: the training and testing sets. Both sets have been labeled, and instances of hate speech have been categorized into nine distinct categories. ## Dataset Description The dataset comprises three key features: tweets, labels (with hate speech denoted as 1 and non-hate speech as 0), and categories (behavior, class, disability, ethnicity, gender, physical appearance, race, religion, sexual orientation). * Training set: contains a total of 5679 tweets (Hate Speech: 1516 / Non Hate Speech: 4163), and the number of hate-speech tweets in each category is not equally distributed. * Testing set: contains a total of 1000 tweets (Hate Speech: 500 / Non Hate Speech: 500), and the number of hate-speech tweets in each category is generally even. ## Uses This dataset can be utilized for various purposes, including but not limited to: * Developing and training machine learning models for hate speech detection. * Analyzing the prevalence and patterns of hate speech across different categories. * Understanding the challenges associated with categorizing hate speech on social media platforms. Check out the example [project](https://github.com/Wei-Hsi/AI4health)! ## Source Data The dataset utilized in this study is sourced from Kaggle and named the [Hate Speech and Offensive Language dataset](https://www.kaggle.com/datasets/mrmorj/hate-speech-and-offensive-language-dataset/). Hate speech instances are identified by selecting tweets within the "class" column. ## Annotations Category labels were generated through an OpenAI API call employing the GPT-3.5 model. It's important to note the instability in category predictions when utilizing GPT-3.5 for label generation, as it tends to predict different categories each time. However, we have confirmed that these tweets were labeled correctly. If there are any misclassified labels, please feel free to reach out. Thank you in advance for your assistance. ## Dataset Card Contact Please feel free to contact me via [email protected]!
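Based on the description above, a minimal loading sketch is given below. The split and column names ("train", "tweet", "label", "category") are assumptions inferred from this card rather than a confirmed schema, so check the repository files before relying on them.

```python
from collections import Counter
from datasets import load_dataset

# Assumed split and column names -- verify against the actual files in this repo
ds = load_dataset("thefrankhsu/hate_speech_twitter")
train = ds["train"]

print(train[0])  # one labeled tweet

# Distribution of the nine categories among hate-speech examples (label == 1)
category_counts = Counter(row["category"] for row in train if row["label"] == 1)
print(category_counts)
```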
thefrankhsu/hate_speech_twitter
[ "task_categories:text-classification", "size_categories:1K<n<10K", "language:en", "health", "tweet", "hate speech", "mental health", "hate speech detection", "hate speech classification", "social media", "mobile health", "region:us" ]
2023-12-08T04:19:30+00:00
{"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"], "tags": ["health", "tweet", "hate speech", "mental health", "hate speech detection", "hate speech classification", "social media", "mobile health"]}
2023-12-15T03:47:33+00:00
[]
[ "en" ]
TAGS #task_categories-text-classification #size_categories-1K<n<10K #language-English #health #tweet #hate speech #mental health #hate speech detection #hate speech classification #social media #mobile health #region-us
## Dataset Card for Dataset Name The dataset is designed to analyze and address hate speech within online platforms. It consists of two sets: the training and testing sets. The two datasets have been labeled and categorized instances of hate speech into nine distinct categories. ## Dataset Description The dataset comprises three key features: tweets, labels (with hate speech denoted as 1 and non-hate speech as 0), and categories (behavior, class, disability, ethnicity, gender, physical appearance, race, religion, sexual orientation). * Training set: contains a total of 5679 tweets (Hate Speech: 1516 / Non Hate Speech: 4163), and the number of hate speech in each category is not equally distributed. * Testing set: contains a total of 1000 tweets (Hate Speech: 500 / Non Hate Speech: 500), and the number of hate speech in each category is generally even. ## Uses This dataset can be utilized for various purposes, including but not limited to: * Developing and training machine learning models for hate speech detection. * Analyzing the prevalence and patterns of hate speech across different categories. * Understanding the challenges associated with categorizing hate speech on social media platforms. Check it out for the example project! ## Source Data The dataset utilized in this study is sourced from Kaggle and named the Hate Speech and Offensive Language dataset. Hate speech instances are identified by selecting tweets within the "class" column. ## Annotations Category labels were generated through an OpenAI API call employing the GPT-3.5 model. It's important to note the instability in category predictions when utilizing GPT-3.5 for label generation, as it tends to predict different categories each time. However, we have confirmed that these tweets were labeled correctly. If there are any misclassified labels, please feel free to reach out. Thank you in advance for your assistance. ## Dataset Card Contact Please feel free to contact me via wh476@URL!
[ "## Dataset Card for Dataset Name\n\nThe dataset is designed to analyze and address hate speech within online platforms. It consists of two sets: the training and testing sets. The two datasets have been labeled and categorized instances of hate speech into nine distinct categories.", "## Dataset Description\n\nThe dataset comprises three key features: tweets, labels (with hate speech denoted as 1 and non-hate speech as 0), and categories (behavior, class, disability, ethnicity, gender,\nphysical appearance, race, religion, sexual orientation).\n\n* Training set: contains a total of 5679 tweets (Hate Speech: 1516 / Non Hate Speech: 4163), and the number of hate speech in each category is not equally distributed.\n* Testing set: contains a total of 1000 tweets (Hate Speech: 500 / Non Hate Speech: 500), and the number of hate speech in each category is generally even.", "## Uses\nThis dataset can be utilized for various purposes, including but not limited to:\n* Developing and training machine learning models for hate speech detection.\n* Analyzing the prevalence and patterns of hate speech across different categories.\n* Understanding the challenges associated with categorizing hate speech on social media platforms.\n\nCheck it out for the example project!", "## Source Data\n\nThe dataset utilized in this study is sourced from Kaggle and named the Hate Speech and Offensive Language dataset. \nHate speech instances are identified by selecting tweets within the \"class\" column.", "## Annotations\n\nCategory labels were generated through an OpenAI API call employing the GPT-3.5 model. \nIt's important to note the instability in category predictions when utilizing GPT-3.5 for label generation, as it tends to predict different categories each time. However, we have confirmed that these tweets were labeled correctly. If there are any misclassified labels, please feel free to reach out. Thank you in advance for your assistance.", "## Dataset Card Contact\nPlease feel free to contact me via wh476@URL!" ]
[ "TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-English #health #tweet #hate speech #mental health #hate speech detection #hate speech classification #social media #mobile health #region-us \n", "## Dataset Card for Dataset Name\n\nThe dataset is designed to analyze and address hate speech within online platforms. It consists of two sets: the training and testing sets. The two datasets have been labeled and categorized instances of hate speech into nine distinct categories.", "## Dataset Description\n\nThe dataset comprises three key features: tweets, labels (with hate speech denoted as 1 and non-hate speech as 0), and categories (behavior, class, disability, ethnicity, gender,\nphysical appearance, race, religion, sexual orientation).\n\n* Training set: contains a total of 5679 tweets (Hate Speech: 1516 / Non Hate Speech: 4163), and the number of hate speech in each category is not equally distributed.\n* Testing set: contains a total of 1000 tweets (Hate Speech: 500 / Non Hate Speech: 500), and the number of hate speech in each category is generally even.", "## Uses\nThis dataset can be utilized for various purposes, including but not limited to:\n* Developing and training machine learning models for hate speech detection.\n* Analyzing the prevalence and patterns of hate speech across different categories.\n* Understanding the challenges associated with categorizing hate speech on social media platforms.\n\nCheck it out for the example project!", "## Source Data\n\nThe dataset utilized in this study is sourced from Kaggle and named the Hate Speech and Offensive Language dataset. \nHate speech instances are identified by selecting tweets within the \"class\" column.", "## Annotations\n\nCategory labels were generated through an OpenAI API call employing the GPT-3.5 model. \nIt's important to note the instability in category predictions when utilizing GPT-3.5 for label generation, as it tends to predict different categories each time. However, we have confirmed that these tweets were labeled correctly. If there are any misclassified labels, please feel free to reach out. Thank you in advance for your assistance.", "## Dataset Card Contact\nPlease feel free to contact me via wh476@URL!" ]
[ 62, 65, 151, 78, 53, 103, 18 ]
[ "passage: TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-English #health #tweet #hate speech #mental health #hate speech detection #hate speech classification #social media #mobile health #region-us \n## Dataset Card for Dataset Name\n\nThe dataset is designed to analyze and address hate speech within online platforms. It consists of two sets: the training and testing sets. The two datasets have been labeled and categorized instances of hate speech into nine distinct categories.## Dataset Description\n\nThe dataset comprises three key features: tweets, labels (with hate speech denoted as 1 and non-hate speech as 0), and categories (behavior, class, disability, ethnicity, gender,\nphysical appearance, race, religion, sexual orientation).\n\n* Training set: contains a total of 5679 tweets (Hate Speech: 1516 / Non Hate Speech: 4163), and the number of hate speech in each category is not equally distributed.\n* Testing set: contains a total of 1000 tweets (Hate Speech: 500 / Non Hate Speech: 500), and the number of hate speech in each category is generally even.## Uses\nThis dataset can be utilized for various purposes, including but not limited to:\n* Developing and training machine learning models for hate speech detection.\n* Analyzing the prevalence and patterns of hate speech across different categories.\n* Understanding the challenges associated with categorizing hate speech on social media platforms.\n\nCheck it out for the example project!## Source Data\n\nThe dataset utilized in this study is sourced from Kaggle and named the Hate Speech and Offensive Language dataset. \nHate speech instances are identified by selecting tweets within the \"class\" column." ]
f44b7e2944a473112fd9aad5de5bdc8e58e40cba
This is the [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) GPT4 subset with the original FLAN answers. Each even row (indexed starting from 0) contains the OpenOrca GPT4 answer, while each odd row contains the corresponding FLAN answer.
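As a sketch of how the even/odd layout can be consumed (the split name and the assumption that row order is preserved on load are not confirmed by this card):

```python
from datasets import load_dataset

# Assumed: a single "train" split that preserves the even/odd row ordering described above
ds = load_dataset("imone/OpenOrca_FLAN", split="train")

# Even rows hold the OpenOrca GPT-4 answer, odd rows the matching original FLAN answer
for i in range(0, 6, 2):
    gpt4_row = ds[i]
    flan_row = ds[i + 1]
    print(gpt4_row)
    print(flan_row)
```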
imone/OpenOrca_FLAN
[ "license:mit", "region:us" ]
2023-12-08T04:28:50+00:00
{"license": "mit"}
2023-12-08T04:37:54+00:00
[]
[]
TAGS #license-mit #region-us
This is the OpenOrca GPT4 subset with the original FLAN answers. Each even row (indexed starting from 0) contains the OpenOrca GPT4 answer, while each odd row contains the corresponding FLAN answer.
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
8d8b97ddfa45525f9d961597e1142493b31bf14d
# Collection of 17.154 million sentences from the December 2023 Portuguese Wikipedia dump. License: unknown. Language: Portuguese (pt).
sandro-xxx/ptwiki-sentences
[ "region:us" ]
2023-12-08T04:41:30+00:00
{}
2023-12-08T05:24:02+00:00
[]
[]
TAGS #region-us
# Collection of 17.154 million sentences from the December 2023 Portuguese Wikipedia dump. --- license: unknown language: - pt ---
[ "# Collection of 17.154 million sentences from the December 2023 Portuguese Wikipedia dump.\n\n---\nlicense: unknown\nlanguage:\n- pt\n---" ]
[ "TAGS\n#region-us \n", "# Collection of 17.154 million sentences from the December 2023 Portuguese Wikipedia dump.\n\n---\nlicense: unknown\nlanguage:\n- pt\n---" ]
[ 6, 30 ]
[ "passage: TAGS\n#region-us \n# Collection of 17.154 million sentences from the December 2023 Portuguese Wikipedia dump.\n\n---\nlicense: unknown\nlanguage:\n- pt\n---" ]
07f0f0adb7a8ac16d6299988535d2185300e371d
# Dataset Card for "platypus-templated-flat" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sordonia/platypus-templated-flat
[ "region:us" ]
2023-12-08T05:26:51+00:00
{"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 35031259, "num_examples": 24926}], "download_size": 15986525, "dataset_size": 35031259}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-08T05:26:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "platypus-templated-flat" More Information needed
[ "# Dataset Card for \"platypus-templated-flat\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"platypus-templated-flat\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"platypus-templated-flat\"\n\nMore Information needed" ]
a2908c0cdcbcf50dc4023547943441cc1ed4e1b2
Grounded QA dataset created from reports collected from https://www.amnesty.org/en/research/ ## Example ``` {'question': 'Which private companies in the Americas are the largest GHG emitters according to the Carbon Majors database?', 'ground_truths': ['The largest private companies in the Americas that are the largest GHG emitters according to the Carbon Majors database are ExxonMobil, Chevron, and Peabody.'], 'answer': 'According to the Carbon Majors database, the largest private companies in the Americas that are the largest GHG emitters are:\n\n1. Chevron Corporation (United States)\n2. ExxonMobil Corporation (United States)\n3. ConocoPhillips Company (United States)\n4. BP plc (United Kingdom, but with significant operations in the Americas)\n5. Royal Dutch Shell plc (Netherlands, but with significant operations in the Americas)\n6. Peabody Energy Corporation (United States)\n7. Duke Energy Corporation (United States)\n8. TotalEnergies SE (France, but with significant operations in the Americas)\n9. BHP Group Limited (Australia, but with significant operations in the Americas)\n10. Rio Tinto Group (United Kingdom/Australia, but with significant operations in the Americas)\n\nPlease note that the rankings may change over time as new data becomes available.', 'contexts': ['The private companies responsible for the most emissions during this period, according to the database, are from the United States: ExxonMobil, Chevron and Peabody.\nThe largest emitter amongst state-owned companies in the Americas is Mexican company Pemex, followed by Venezuelan company Petróleos de Venezuela, S.A.']} ``` Available languages: English, Malayalam, Hindi ## Usage Note: Only the `"eval"` split is available for this dataset. ```python from datasets import load_dataset malayalam_dataset = load_dataset("explodinggradients/amnesty_qa", "malayalam") malayalam_dataset["eval"] ```
explodinggradients/amnesty_qa
[ "region:us" ]
2023-12-08T06:09:23+00:00
{}
2024-01-26T03:07:03+00:00
[]
[]
TAGS #region-us
Grounded QA dataset created from reports collected from URL ## Example Available languages : English, Malayalam, Hindi ## Usage Note: Only the '"eval"' split is available for this dataset.
[ "## Example\n\n\n\nAvailable languages : English, Malayalam, Hindi", "## Usage\n\nNote: Only the '\"eval\"' split is available for this dataset." ]
[ "TAGS\n#region-us \n", "## Example\n\n\n\nAvailable languages : English, Malayalam, Hindi", "## Usage\n\nNote: Only the '\"eval\"' split is available for this dataset." ]
[ 6, 12, 21 ]
[ "passage: TAGS\n#region-us \n## Example\n\n\n\nAvailable languages : English, Malayalam, Hindi## Usage\n\nNote: Only the '\"eval\"' split is available for this dataset." ]
c32a09b2135248c82adb5adb5345e5550b2d6594
# CogVLM-SFT-311K: Bilingual Visual Instruction Data in CogVLM SFT CogVLM-SFT-311K is the primary aligned corpus used in the initial training of CogVLM v1.0. The process of constructing this dataset is as follows: 1. Approximately 3500 high-quality data samples were selected from the open source [MiniGPT-4](https://huggingface.co/datasets/Vision-CAIR/cc_sbu_align), known as minigpt4-3500. 2. Minigpt4-3500 was integrated with [Llava-Instruct-150K](https://huggingface.co/datasets/liuhaotian/LLaVA-Instruct-150K) and translated into Chinese through a language model. 3. We discovered significant noise in the detailed description part of minigpt4-3500 and Llava-instruct. Thus, we corrected these Chinese corpora and retranslated them into English. ## Dataset Information The dataset contains three folders corresponding to the mixed part of minigpt4-3500 and llava, the llava solo conversation, and the multi-turn conversation datasets. Their layout is as follows: ``` .CogVLM-SFT-311K ├── llava_details-minigpt4_3500_formate ├── llava_instruction_multi_conversations_formate └── llava_instruction_single_conversation_formate ``` In our open-source data, the datasets are distributed as follows: ``` .llava_details-minigpt4_3500_formate ├── images │ └── 00000001.jpg └── labels └── 00000001.json ``` Images are stored in the images folder, while labels containing the corresponding image description or dialogue are stored in the labels folder. ## Dataset Quantity + llava_details-minigpt4_3500_formate: 22,464 images and descriptions + llava_instruction_multi_conversations_formate: 56,673 images and multi-turn conversations + llava_instruction_single_conversation_formate: 76,634 images and single-turn conversations ## Dataset Format Caption format for image description ``` { "captions": [ { "role": "caption", "content": "The photograph features a beach scene with a group of people surfing in the ocean. There are ten individuals spaced out in the water, sitting or lying on various sizes of body boards. The surfers line up in a row, close to the sandy shoreline. Some of the body boards are visible floating on the surface of the water, with the surfers riding on them or nearby." } ] } ``` Conversation format for image dialogue ``` { "conversations": [ { "role": "user", "content": "What can be inferred about the zebras' behavior and surroundings?" }, { "role": "assistant", "content": "Based on the image, we can infer that the two zebras are likely seeking relief from the sun's heat, as they are standing side by side under the branches of a thorny tree. This shade-providing tree offers some respite from the sun, possibly during the hottest part of the day. The zebras are in a green field with grass, providing them with an ideal environment to graze and eat while staying near their source of shelter. This shows that the zebras' behavior is influenced by the conditions and available resources in their surroundings. It also highlights that these animals adopt strategies to adapt to the fluctuating conditions of their environment, such as cooperation and seeking shelter, to survive and thrive in their natural habitat." } ] } ``` ## License + Due to non-commercial agreements, we did not use these data in the bilingual version of CogVLM or any other models involving commercialization. + The dataset license adheres to: <br> Attribution-NonCommercial 4.0 International. It should abide by the policy of OpenAI: https://openai.com/policies/terms-of-use This will not allow you to use these data for any **commercial activities**.
## References This project utilizes data and concepts based on the following research papers: - Zhu, D., Chen, J., Shen, X., Li, X., & Elhoseiny, M. (2023). MiniGPT-4: Enhancing Vision-Language Understanding with Advanced Large Language Models. arXiv preprint arXiv:2304.10592. - Liu, H., Li, C., Wu, Q., & Lee, Y. J. (2023). Visual Instruction Tuning. arXiv:2304.08485.
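As a usage sketch (not an official loader), one way to read a single image/label pair from the folder layout described above; the file-name pattern and JSON keys follow the examples in this card, but local paths are assumptions and will differ on your machine:

```python
import json
from pathlib import Path

from PIL import Image

# Assumed local copy of one sub-dataset, laid out as shown in "Dataset Information"
root = Path("llava_details-minigpt4_3500_formate")
sample_id = "00000001"

image = Image.open(root / "images" / f"{sample_id}.jpg")
with open(root / "labels" / f"{sample_id}.json", encoding="utf-8") as f:
    label = json.load(f)

# Caption-style files use the "captions" key, dialogue-style files use "conversations"
for turn in label.get("captions", label.get("conversations", [])):
    print(turn["role"], ":", turn["content"][:100])
```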
THUDM/CogVLM-SFT-311K
[ "license:cc-by-nc-4.0", "region:us" ]
2023-12-08T06:33:26+00:00
{"license": "cc-by-nc-4.0"}
2023-12-26T10:03:17+00:00
[]
[]
TAGS #license-cc-by-nc-4.0 #region-us
# CogVLM-SFT-311K: Bilingual Visual Instruction Data in CogVLM SFT CogVLM-SFT-311K is the primary aligned corpus used in the initial training of CogVLM v1.0. The process of constructing this dataset is as follows: 1. Approximately 3500 high-quality data samples were selected from the open source MiniGPT-4, known as minigpt4-3500. 2. Minigpt4-3500 was integrated with Llava-Instruct-150K and translated into Chinese through a language model. 3. We discovered significant noise in the detailed description part of minigpt4-3500 and Llava-instruct. Thus, we corrected these Chinese corpora and retranslated them into English. ## Dataset Information The dataset contains three folders corresponding to the mixed part of minigpt4-3500 and llava, the llava solo conversation, and the multi-turn conversation datasets. Their layout is as follows: In our open-source data, the datasets are distributed as follows: Images are stored in the images folder, while labels containing the corresponding image description or dialogue are stored in the labels folder. ## Dataset Quantity + llava_details-minigpt4_3500_formate: 22,464 images and descriptions + llava_instruction_multi_conversations_formate: 56,673 images and multi-turn conversations + llava_instruction_single_conversation_formate: 76,634 images and single-turn conversations Dataset Format Caption format for image description Conversation format for image dialogue ## License + Due to non-commercial agreements, we did not use these data in the bilingual version of CogVLM or any other models involving commercialization. + The dataset license adheres to: <br> Attribution-NonCommercial 4.0 International. It should abide by the policy of OpenAI: URL This will not allow you to use these data for any commercial activitiesI. ## References This project utilizes data and concepts based on the following research papers: - Zhu, D., Chen, J., Shen, X., Li, X., & Elhoseiny, M. (2023). MiniGPT-4: Enhancing Vision-Language Understanding with Advanced Large Language Models. arXiv preprint arXiv:2304.10592. - Liu, H., Li, C., Wu, Q., & Lee, Y. J. (2023). Visual Instruction Tuning. arXiv:2304.08485.
[ "# CogVLM-SFT-311K: Bilingual Visual Instruction Data in CogVLM SFT\n\nCogVLM-SFT-311K is the primary aligned corpus used in the initial training of CogVLM v1.0. The process of constructing this dataset is as follows:\n1. Approximately 3500 high-quality data samples were selected from the open source MiniGPT-4, known as minigpt4-3500.\n2. Minigpt4-3500 was integrated with Llava-Instruct-150K and translated into Chinese through a language model.\n3. We discovered significant noise in the detailed description part of minigpt4-3500 and Llava-instruct. Thus, we corrected these Chinese corpora and retranslated them into English.", "## Dataset Information\n\nThe dataset contains three folders corresponding to the mixed part of minigpt4-3500 and llava, the llava solo conversation, and the multi-turn conversation datasets. Their layout is as follows:\n\nIn our open-source data, the datasets are distributed as follows:\n\nImages are stored in the images folder, while labels containing the corresponding image description or dialogue are stored in the labels folder.", "## Dataset Quantity\n\n+ llava_details-minigpt4_3500_formate: 22,464 images and descriptions\n+ llava_instruction_multi_conversations_formate: 56,673 images and multi-turn conversations\n+ llava_instruction_single_conversation_formate: 76,634 images and single-turn conversations\n\nDataset Format\n\nCaption format for image description\n\nConversation format for image dialogue", "## License\n\n+ Due to non-commercial agreements, we did not use these data in the bilingual version of CogVLM or any other models involving commercialization.\n+ The dataset license adheres to: <br> Attribution-NonCommercial 4.0 International. It should abide by the policy of OpenAI: URL\nThis will not allow you to use these data for any commercial activitiesI.", "## References\nThis project utilizes data and concepts based on the following research papers:\n- Zhu, D., Chen, J., Shen, X., Li, X., & Elhoseiny, M. (2023). MiniGPT-4: Enhancing Vision-Language Understanding with Advanced Large Language Models. arXiv preprint arXiv:2304.10592.\n- Liu, H., Li, C., Wu, Q., & Lee, Y. J. (2023). Visual Instruction Tuning. arXiv:2304.08485." ]
[ "TAGS\n#license-cc-by-nc-4.0 #region-us \n", "# CogVLM-SFT-311K: Bilingual Visual Instruction Data in CogVLM SFT\n\nCogVLM-SFT-311K is the primary aligned corpus used in the initial training of CogVLM v1.0. The process of constructing this dataset is as follows:\n1. Approximately 3500 high-quality data samples were selected from the open source MiniGPT-4, known as minigpt4-3500.\n2. Minigpt4-3500 was integrated with Llava-Instruct-150K and translated into Chinese through a language model.\n3. We discovered significant noise in the detailed description part of minigpt4-3500 and Llava-instruct. Thus, we corrected these Chinese corpora and retranslated them into English.", "## Dataset Information\n\nThe dataset contains three folders corresponding to the mixed part of minigpt4-3500 and llava, the llava solo conversation, and the multi-turn conversation datasets. Their layout is as follows:\n\nIn our open-source data, the datasets are distributed as follows:\n\nImages are stored in the images folder, while labels containing the corresponding image description or dialogue are stored in the labels folder.", "## Dataset Quantity\n\n+ llava_details-minigpt4_3500_formate: 22,464 images and descriptions\n+ llava_instruction_multi_conversations_formate: 56,673 images and multi-turn conversations\n+ llava_instruction_single_conversation_formate: 76,634 images and single-turn conversations\n\nDataset Format\n\nCaption format for image description\n\nConversation format for image dialogue", "## License\n\n+ Due to non-commercial agreements, we did not use these data in the bilingual version of CogVLM or any other models involving commercialization.\n+ The dataset license adheres to: <br> Attribution-NonCommercial 4.0 International. It should abide by the policy of OpenAI: URL\nThis will not allow you to use these data for any commercial activitiesI.", "## References\nThis project utilizes data and concepts based on the following research papers:\n- Zhu, D., Chen, J., Shen, X., Li, X., & Elhoseiny, M. (2023). MiniGPT-4: Enhancing Vision-Language Understanding with Advanced Large Language Models. arXiv preprint arXiv:2304.10592.\n- Liu, H., Li, C., Wu, Q., & Lee, Y. J. (2023). Visual Instruction Tuning. arXiv:2304.08485." ]
[ 17, 172, 103, 98, 90, 129 ]
[ "passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n# CogVLM-SFT-311K: Bilingual Visual Instruction Data in CogVLM SFT\n\nCogVLM-SFT-311K is the primary aligned corpus used in the initial training of CogVLM v1.0. The process of constructing this dataset is as follows:\n1. Approximately 3500 high-quality data samples were selected from the open source MiniGPT-4, known as minigpt4-3500.\n2. Minigpt4-3500 was integrated with Llava-Instruct-150K and translated into Chinese through a language model.\n3. We discovered significant noise in the detailed description part of minigpt4-3500 and Llava-instruct. Thus, we corrected these Chinese corpora and retranslated them into English.## Dataset Information\n\nThe dataset contains three folders corresponding to the mixed part of minigpt4-3500 and llava, the llava solo conversation, and the multi-turn conversation datasets. Their layout is as follows:\n\nIn our open-source data, the datasets are distributed as follows:\n\nImages are stored in the images folder, while labels containing the corresponding image description or dialogue are stored in the labels folder.## Dataset Quantity\n\n+ llava_details-minigpt4_3500_formate: 22,464 images and descriptions\n+ llava_instruction_multi_conversations_formate: 56,673 images and multi-turn conversations\n+ llava_instruction_single_conversation_formate: 76,634 images and single-turn conversations\n\nDataset Format\n\nCaption format for image description\n\nConversation format for image dialogue## License\n\n+ Due to non-commercial agreements, we did not use these data in the bilingual version of CogVLM or any other models involving commercialization.\n+ The dataset license adheres to: <br> Attribution-NonCommercial 4.0 International. It should abide by the policy of OpenAI: URL\nThis will not allow you to use these data for any commercial activitiesI." ]
82d93dfce67e01a748e7e576544cc8fba1aa9a09
# Dataset Card for "context_extension-mistral-natural_distribution-16k" * 32k samples * mistral token size 1 < x < 16400 * ~natural size distribution (lots of small, few long, 1/x-like) * build with small (no filter) 1/3 + 2/3 of long (+16k), resize all +6k to (1<x<16_400) * from redpajama-v2 [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sade-adrien/context_extension-mistral-natural_distribution-16k
[ "region:us" ]
2023-12-08T07:21:28+00:00
{"dataset_info": {"features": [{"name": "raw_content", "dtype": "string"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 6760321689, "num_examples": 32000}], "download_size": 2955669157, "dataset_size": 6760321689}}
2023-12-08T17:41:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "context_extension-mistral-natural_distribution-16k" * 32k samples * mistral token size 1 < x < 16400 * ~natural size distribution (lots of small, few long, 1/x-like) * build with small (no filter) 1/3 + 2/3 of long (+16k), resize all +6k to (1<x<16_400) * from redpajama-v2 More Information needed
[ "# Dataset Card for \"context_extension-mistral-natural_distribution-16k\"\n* 32k samples \n* mistral token size 1 < x < 16400\n* ~natural size distribution (lots of small, few long, 1/x-like)\n* build with small (no filter) 1/3 + 2/3 of long (+16k), resize all +6k to (1<x<16_400)\n* from redpajama-v2\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"context_extension-mistral-natural_distribution-16k\"\n* 32k samples \n* mistral token size 1 < x < 16400\n* ~natural size distribution (lots of small, few long, 1/x-like)\n* build with small (no filter) 1/3 + 2/3 of long (+16k), resize all +6k to (1<x<16_400)\n* from redpajama-v2\n\nMore Information needed" ]
[ 6, 102 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"context_extension-mistral-natural_distribution-16k\"\n* 32k samples \n* mistral token size 1 < x < 16400\n* ~natural size distribution (lots of small, few long, 1/x-like)\n* build with small (no filter) 1/3 + 2/3 of long (+16k), resize all +6k to (1<x<16_400)\n* from redpajama-v2\n\nMore Information needed" ]
ca0452df0693347f4b2c1ccc74cef2893d259352
# cifar10-multirun-logits-60k This repo contains the logit outputs produced by 61,565 independently and identically trained ResNets on the CIFAR-10 test-set. To plot the histogram of accuracies across the first 500 trained models, run the following: ``` import numpy as np import matplotlib.pyplot as plt from huggingface_hub import HfApi api = HfApi() logits_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='logits500.npy') labels_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy') logits = np.load(logits_path) labels = np.load(labels_path) acc = (logits.argmax(-1) == labels).mean(1) plt.hist(acc, bins=16, range=(0.936, 0.952)) plt.xlabel('test-set accuracy') plt.ylabel('frequency') plt.title('500 runs of training') plt.show() ``` <img src="acc500.png" alt="500 runs" width="600"/> To plot the full histogram across all 61,565 runs, replace `logits500.npy` with `logits.npy` (12GB). <img src="acc60k.png" alt="60k runs" width="600"/> ## Further detail The file `logits.npy` is an fp16 tensor of shape `(61565, 10000, 10)`, where e.g. `logits[34211, 2341, 0]` is the first logit (corresponding to the `airplane` class) predicted by the 34,211th trained model on the 2,341th example in the CIFAR-10 test-set. It was generated by using 1,000 A100-hours to run [this training script](https://github.com/KellerJordan/cifar10-loader/blob/master/example_training/train.py) 61,565 times. Each run used identical hyperparameters but with varied training stochasticity (model initialization, data order, and data augmentations). This tensor can be used to learn various pieces of information about the statistical nature of neural network training. We have extracted what seemed to be a few higher-order bits in [this paper](https://arxiv.org/abs/2304.01910). There is more to discover. The file `labels.npy` of shape `(10000,)` is the list of labels between `0` and `9` for each of the 10,000 examples. We use the same ordering of CIFAR-10 examples as `torchvision.datasets.CIFAR10`: ``` import numpy as np import torchvision from huggingface_hub import HfApi api = HfApi() path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy') labels = np.load(path) torchvision_labels = np.array(torchvision.datasets.CIFAR10('/tmp', train=False).targets) assert (labels == torchvision_labels).all() # True ``` So to recover the i-th example, use the following: ``` image, label = torchvision.datasets.CIFAR10('/tmp', train=False)[i] ``` --- ## New information What curve will the following code produce? 
([answer](https://huggingface.co/datasets/kjj0/cifar10-multirun-logits-60k/blob/main/airplane_knn_curve.png))
```
import numpy as np
from tqdm import tqdm
import matplotlib.pyplot as plt
import torch
from huggingface_hub import HfApi
api = HfApi()
logits_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='logits.npy')
labels_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy')
logits0 = torch.tensor(np.load(logits_path))
labels = torch.tensor(np.load(labels_path))

mm = torch.logspace(0, 5, 20).long()
mm[-1] = len(logits0)
k = 15

accs = []
for m in tqdm(mm):

    # get airplane logits from ensemble of size m
    logits = logits0[:m, :, 0].cuda().float()

    # calculate correlations between examples
    logits_norm = (logits - logits.mean(0, keepdim=True)) / logits.std(0, keepdim=True)
    corr = (logits_norm.T @ logits_norm) / len(logits_norm)

    # calculate knn accuracy
    corr_nodiag = corr - 1000 * torch.eye(len(corr)).cuda() # remove diagonal
    idxs = corr_nodiag.topk(k=k, dim=1).indices.cpu()
    pred = labels[idxs].mode(dim=1).values
    acc = (pred == labels).float().mean().item()
    accs.append(acc)

plt.plot(mm, accs)
plt.xlabel('number of models in ensemble')
plt.ylabel('accuracy')
plt.title('knn (k=%d) accuracy on just airplane logit' % k)
plt.xscale('log')
plt.ylim(0, 1)
plt.show()
```

?????????

```
import numpy as np
from tqdm import tqdm
import torch
import torch.nn.functional as F
from huggingface_hub import HfApi
api = HfApi()
logits_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='logits.npy')
labels_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy')
logits0 = torch.tensor(np.load(logits_path))
labels = torch.tensor(np.load(labels_path))

m = 12000 # number of models
logits1 = logits0[:m].float()
print('ensemble accuracy:', (logits1.mean(0).argmax(1) == labels).float().mean())

logits_logsoftmax = logits1.log_softmax(-1)
n = logits1.shape[1]
corr_all = torch.zeros(n, n).cuda()
for c in tqdm(range(10)):
    logits = logits_logsoftmax[:, :, c].cuda()
    # normalize
    logits -= logits.mean(0, keepdim=True)
    logits -= logits.mean(1, keepdim=True)
    logits /= logits.std(0, keepdim=True)
    corr = (logits.T @ logits) / len(logits)
    corr_all += corr

corr_nodiag = corr_all - 1e9 * torch.eye(n).cuda() # remove diagonal
nearest_nbr = corr_nodiag.argmax(1).cpu()
assert (nearest_nbr != torch.arange(n)).all() # we're not just somehow reading out the ensemble prediction!
print('kNN accuracy (k=1):', (labels[nearest_nbr] == labels).float().mean())

res = corr_nodiag.topk(k=10, dim=1)
yy = F.one_hot(labels[res.indices.cpu()]).cuda() # labels of nns
yy1 = yy * res.values[..., None]**4 # some kind of sparsity
pred = yy1.mean(1).argmax(1).cpu()
print('weighted kNN accuracy (k=10):', (pred == labels).float().mean())
```
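As one more minimal illustration of what the tensor supports (separate from the question above), the per-example accuracy across runs gives a rough difficulty score for each test example. This sketch is ours rather than part of the original release; it reuses the smaller `logits500.npy` file and plain numpy.

```
import numpy as np
from huggingface_hub import HfApi

api = HfApi()
logits_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='logits500.npy')
labels_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy')
logits = np.load(logits_path)  # shape (500, 10000, 10)
labels = np.load(labels_path)  # shape (10000,)

# fraction of the 500 runs that classify each test example correctly
per_example_acc = (logits.argmax(-1) == labels).mean(0)  # shape (10000,)

print('examples missed by every run:', int((per_example_acc == 0).sum()))
print('examples solved by every run:', int((per_example_acc == 1).sum()))
```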
kjj0/cifar10-multirun-logits-60k
[ "license:mit", "arxiv:2304.01910", "region:us" ]
2023-12-08T07:37:01+00:00
{"license": "mit"}
2023-12-21T11:50:05+00:00
[ "2304.01910" ]
[]
TAGS #license-mit #arxiv-2304.01910 #region-us
# cifar10-multirun-logits-60k This repo contains the logit outputs produced by 61,565 independently and identically trained ResNets on the CIFAR-10 test-set. To plot the histogram of accuracies across the first 500 trained models, run the following: <img src="URL" alt="500 runs" width="600"/> To plot the full histogram across all 61,565 runs, replace 'URL' with 'URL' (12GB). <img src="URL" alt="60k runs" width="600"/> ## Further detail The file 'URL' is an fp16 tensor of shape '(61565, 10000, 10)', where e.g. 'logits[34211, 2341, 0]' is the first logit (corresponding to the 'airplane' class) predicted by the 34,211th trained model on the 2,341th example in the CIFAR-10 test-set. It was generated by using 1,000 A100-hours to run this training script 61,565 times. Each run used identical hyperparameters but with varied training stochasticity (model initialization, data order, and data augmentations). This tensor can be used to learn various pieces of information about the statistical nature of neural network training. We have extracted what seemed to be a few higher-order bits in this paper. There is more to discover. The file 'URL' of shape '(10000,)' is the list of labels between '0' and '9' for each of the 10,000 examples. We use the same ordering of CIFAR-10 examples as 'torchvision.datasets.CIFAR10': So to recover the i-th example, use the following: --- ## New information What curve will the following code produce? (answer) ?????????
[ "# cifar10-multirun-logits-60k\n\nThis repo contains the logit outputs produced by 61,565 independently and identically trained ResNets on the CIFAR-10 test-set.\n\nTo plot the histogram of accuracies across the first 500 trained models, run the following:\n\n\n<img src=\"URL\" alt=\"500 runs\" width=\"600\"/>\n\nTo plot the full histogram across all 61,565 runs, replace 'URL' with 'URL' (12GB).\n\n<img src=\"URL\" alt=\"60k runs\" width=\"600\"/>", "## Further detail\n\nThe file 'URL' is an fp16 tensor of shape '(61565, 10000, 10)', where e.g. 'logits[34211, 2341, 0]'\nis the first logit (corresponding to the 'airplane' class) predicted by the 34,211th trained model\non the 2,341th example in the CIFAR-10 test-set.\n\nIt was generated by using 1,000 A100-hours to run\nthis training script 61,565 times.\nEach run used identical hyperparameters but with varied training stochasticity (model initialization, data order, and data augmentations).\n\nThis tensor can be used to learn various pieces of information about the statistical nature of neural network training.\nWe have extracted what seemed to be a few higher-order bits in this paper. There is more to discover.\n\nThe file 'URL' of shape '(10000,)' is the list of labels between '0' and '9' for each of the 10,000 examples.\n\nWe use the same ordering of CIFAR-10 examples as 'torchvision.datasets.CIFAR10':\n\nSo to recover the i-th example, use the following:\n\n\n---", "## New information\n\nWhat curve will the following code produce? (answer)\n\n\n\n?????????" ]
[ "TAGS\n#license-mit #arxiv-2304.01910 #region-us \n", "# cifar10-multirun-logits-60k\n\nThis repo contains the logit outputs produced by 61,565 independently and identically trained ResNets on the CIFAR-10 test-set.\n\nTo plot the histogram of accuracies across the first 500 trained models, run the following:\n\n\n<img src=\"URL\" alt=\"500 runs\" width=\"600\"/>\n\nTo plot the full histogram across all 61,565 runs, replace 'URL' with 'URL' (12GB).\n\n<img src=\"URL\" alt=\"60k runs\" width=\"600\"/>", "## Further detail\n\nThe file 'URL' is an fp16 tensor of shape '(61565, 10000, 10)', where e.g. 'logits[34211, 2341, 0]'\nis the first logit (corresponding to the 'airplane' class) predicted by the 34,211th trained model\non the 2,341th example in the CIFAR-10 test-set.\n\nIt was generated by using 1,000 A100-hours to run\nthis training script 61,565 times.\nEach run used identical hyperparameters but with varied training stochasticity (model initialization, data order, and data augmentations).\n\nThis tensor can be used to learn various pieces of information about the statistical nature of neural network training.\nWe have extracted what seemed to be a few higher-order bits in this paper. There is more to discover.\n\nThe file 'URL' of shape '(10000,)' is the list of labels between '0' and '9' for each of the 10,000 examples.\n\nWe use the same ordering of CIFAR-10 examples as 'torchvision.datasets.CIFAR10':\n\nSo to recover the i-th example, use the following:\n\n\n---", "## New information\n\nWhat curve will the following code produce? (answer)\n\n\n\n?????????" ]
[ 20, 133, 270, 18 ]
[ "passage: TAGS\n#license-mit #arxiv-2304.01910 #region-us \n# cifar10-multirun-logits-60k\n\nThis repo contains the logit outputs produced by 61,565 independently and identically trained ResNets on the CIFAR-10 test-set.\n\nTo plot the histogram of accuracies across the first 500 trained models, run the following:\n\n\n<img src=\"URL\" alt=\"500 runs\" width=\"600\"/>\n\nTo plot the full histogram across all 61,565 runs, replace 'URL' with 'URL' (12GB).\n\n<img src=\"URL\" alt=\"60k runs\" width=\"600\"/>## Further detail\n\nThe file 'URL' is an fp16 tensor of shape '(61565, 10000, 10)', where e.g. 'logits[34211, 2341, 0]'\nis the first logit (corresponding to the 'airplane' class) predicted by the 34,211th trained model\non the 2,341th example in the CIFAR-10 test-set.\n\nIt was generated by using 1,000 A100-hours to run\nthis training script 61,565 times.\nEach run used identical hyperparameters but with varied training stochasticity (model initialization, data order, and data augmentations).\n\nThis tensor can be used to learn various pieces of information about the statistical nature of neural network training.\nWe have extracted what seemed to be a few higher-order bits in this paper. There is more to discover.\n\nThe file 'URL' of shape '(10000,)' is the list of labels between '0' and '9' for each of the 10,000 examples.\n\nWe use the same ordering of CIFAR-10 examples as 'torchvision.datasets.CIFAR10':\n\nSo to recover the i-th example, use the following:\n\n\n---## New information\n\nWhat curve will the following code produce? (answer)\n\n\n\n?????????" ]
d2a4e5b58e04e9dcdba96f3d6b930d031fb8262e
<p align="center">
💻 <a href="https://github.com/dll-wu/Uni-Encoder" target="_blank">[Github Repo]</a> • 📃 <a href="https://arxiv.org/abs/2106.01263" target="_blank">[Paper]</a>
</p>

## Overview
This is a collection of datasets used in the paper titled "Uni-Encoder: A Fast and Accurate Response Selection Paradigm for Generation-Based Dialogue Systems".

The following datasets have been included:
 - Ubuntu Corpus V1
 - Ubuntu Corpus V2
 - PersonaChat
 - Douban Conv Corpus

All datasets have been standardized to a unified format for research needs.

## Citation
```
@inproceedings{song2023uni,
  title={Uni-encoder: A fast and accurate response selection paradigm for generation-based dialogue systems},
  author={Song, Chiyu and He, Hongliang and Yu, Haofei and Fang, Pengfei and Cui, Leyang and Lan, Zhenzhong},
  booktitle={Findings of the Association for Computational Linguistics: ACL 2023},
  pages={6231--6244},
  year={2023}
}
```
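For readers who want to look at the unified format directly, one low-assumption way is to snapshot the dataset repository and inspect the files. The exact file names and directory layout are not documented here, so treat this as a sketch only.

```python
import os
from huggingface_hub import snapshot_download

# download all files in the dataset repository and list them
local_dir = snapshot_download(repo_id="ChiyuSONG/Uni-Encoder", repo_type="dataset")
for root, _, files in os.walk(local_dir):
    for name in files:
        print(os.path.join(root, name))
```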
ChiyuSONG/Uni-Encoder
[ "task_categories:conversational", "language:en", "language:zh", "license:mit", "arxiv:2106.01263", "region:us" ]
2023-12-08T08:20:34+00:00
{"language": ["en", "zh"], "license": "mit", "task_categories": ["conversational"]}
2023-12-08T12:17:48+00:00
[ "2106.01263" ]
[ "en", "zh" ]
TAGS #task_categories-conversational #language-English #language-Chinese #license-mit #arxiv-2106.01263 #region-us
<p align="center"> <a href="URL target="_blank">[Github Repo]</a> • <a href="URL target="_blank">[Paper]</a> </p> ## Overview This a collection of datasets used in the paper titled "Uni-Encoder: A Fast and Accurate Response Selection Paradigm for Generation-Based Dialogue Systems". The following datasets have been included: - Ubuntu Corpus V1 - Ubuntu Corpus V2 - PersonaChat - Douban Conv Corpus All datasets have been standardized to a unified format for research need.
[ "## Overview\nThis a collection of datasets used in the paper titled \"Uni-Encoder: A Fast and Accurate Response Selection Paradigm for Generation-Based Dialogue Systems\".\n\nThe following datasets have been included:\n - Ubuntu Corpus V1\n - Ubuntu Corpus V2\n - PersonaChat\n - Douban Conv Corpus\n\nAll datasets have been standardized to a unified format for research need." ]
[ "TAGS\n#task_categories-conversational #language-English #language-Chinese #license-mit #arxiv-2106.01263 #region-us \n", "## Overview\nThis a collection of datasets used in the paper titled \"Uni-Encoder: A Fast and Accurate Response Selection Paradigm for Generation-Based Dialogue Systems\".\n\nThe following datasets have been included:\n - Ubuntu Corpus V1\n - Ubuntu Corpus V2\n - PersonaChat\n - Douban Conv Corpus\n\nAll datasets have been standardized to a unified format for research need." ]
[ 39, 89 ]
[ "passage: TAGS\n#task_categories-conversational #language-English #language-Chinese #license-mit #arxiv-2106.01263 #region-us \n## Overview\nThis a collection of datasets used in the paper titled \"Uni-Encoder: A Fast and Accurate Response Selection Paradigm for Generation-Based Dialogue Systems\".\n\nThe following datasets have been included:\n - Ubuntu Corpus V1\n - Ubuntu Corpus V2\n - PersonaChat\n - Douban Conv Corpus\n\nAll datasets have been standardized to a unified format for research need." ]
200ad48b6abd75e26bb131d4ffeb55cb1e91c4e7
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
iamnamas/2letter-condgentext2image
[ "license:mit", "region:us" ]
2023-12-08T08:51:15+00:00
{"license": "mit", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4861791.8, "num_examples": 9600}], "download_size": 4927847, "dataset_size": 4861791.8}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-08T09:25:15+00:00
[]
[]
TAGS #license-mit #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#license-mit #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ 11, 34, 4, 40, 29, 3, 4, 9, 6, 5, 7, 4, 7, 10, 9, 5, 9, 8, 10, 46, 8, 7, 10, 5 ]
[ "passage: TAGS\n#license-mit #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact" ]
0b0ef1e7dc68b02cb7a15442d7524cdb22e0b8b5
# Bangla NER Dataset Data distribution training and validation: - Training data[4447] - 80% - validation data[1146] - 20% Annotation Format Example, ```sh {"text": "তাঁকে হত্যার জন্য যুবলীগের কর্মী মাসুমকে দ্রুত গ্রেপ্তার করে আইনের আওতায় আনার দাবি জানাচ্ছি।’ উপজেলা যুবলীগের আহ্বায়ক বিল্লাল হোসেন মজুমদার বলেন , ‘মাসুম যুবলীগের ঝলম ইউনিয়নের কর্মী । কী কারণে তাঁদের মধ্যে ঝামেলা হয়েছে , তা জানি না।’ বরুড়া থানার ভারপ্রাপ্ত কর্মকর্তা ( ওসি ) এ কে এম কাউছার চৌধুরী বলেন , ‘ঝলম ইউনিয়ন যুবলীগের কর্মী মাসুম ছাত্রদলের কর্মী সাদ্দামকে গুলি করেন ।", "label": [[33, 39, "PER"], [118, 139, "PER"], [147, 152, "PER"], [3, 4, "PER"], [280, 296, "PER"], [33, 37, "PER"], [353, 361, "PER"]]} {"text": "ঢাকা শিশু হাসপাতালের সাবেক উপপরিচালক , শিশু ও পুষ্টিবিশেষজ্ঞ মাহবুবুল হাসান জানান , হঠাৎ মায়ের দুধ না পেলে শিশু প্রোটিন অ্যানার্জি ম্যালনিউট্রেশন ( পিইএম ) রোগে আক্রান্ত হবেই ।", "label": [[61, 75, "PER"]]} {"text": "ফাটল ধরার পরও কারখানা খোলা থাকায় শ্রমিকেরা ২৪ এপ্রিল কাজে যোগ দেন এবং সোহেল রানা উপস্থিত থেকে কারখানা খোলা রাখার বিষয়টি তদারক করেন ।", "label": [[71, 81, "PER"]]} {"text": "সালমান খানের আইনজীবী জানিয়েছেন , দ্রুত তাঁরা এই রায়কে চ্যালেঞ্জ করে মুম্বাইয়ে উচ্চ আদালতে আপিল করছেন ।", "label": [[0, 12, "PER"]]} {"text": "তার চাচা সেমিয়ন ভেঙ্গেরভ ছিলেন একজন বিখ্যাত ফিলোলজিস্ট এবং সাহিত্য সমালোচক।", "label": [[9, 25, "PER"]]} {"text": "এলিশ ট্র্যাক তৈরির সময় বেশ কয়েকজন শিল্পীর দ্বারা অনুপ্রাণিত হওয়ার কথা স্মরণ করেন, বিশেষ করে ফ্রাঙ্ক সিনাত্রা ।", "label": [[95, 111, "PER"]]} {"text": "আনজাম ' , ' কয়লা ' , ' দিল তো পাগল হ্যায় ' ও ' দেবদাস ' ছবিতে শাহরুখের সঙ্গে অভিনয় করেছিলেন মাধুরী দীক্ষিত ।", "label": [[64, 71, "PER"], [95, 109, "PER"]]} {"text": "তিনি মোহাম্মদ বাকির আল-সদর এর ছাত্র ছিলেন।", "label": [[5, 26, "PER"]]} {"text": "নতুন বছরে জ্বলছেন আরও একজন—রজার ফেদেরার ।", "label": [[32, 38, "PER"]]} ``` Reference, 1. banglakit Bangla NER Dataset [Link](https://raw.githubusercontent.com/banglakit/bengali-ner-data/master/main.jsonl) 2. Rifat1493 Bangla NER Dataset [Link](https://github.com/Rifat1493/Bengali-NER/tree/master/Input) 3. SemEval2022 Bangla NER Dataaset [Link](https://competitions.codalab.org/competitions/36425#learn_the_details)
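A small parsing sketch may help when consuming the annotation format above: each JSONL record carries the raw text plus character-offset spans. The file name below is a placeholder, and the end-offset convention (inclusive vs. exclusive) is not documented and may differ across the three merged sources, so the snippet prints both readings for inspection.

```python
import json

path = "bangla_ner_train.jsonl"  # placeholder -- point this at the actual JSONL file

with open(path, encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        text = record["text"]
        for start, end, tag in record["label"]:
            # print both span readings, since the offset convention is unverified
            print(tag, repr(text[start:end]), repr(text[start:end + 1]))
```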
saiful9379/BanglaNER_V1.0
[ "license:mit", "region:us" ]
2023-12-08T09:07:37+00:00
{"license": "mit"}
2023-12-08T09:12:29+00:00
[]
[]
TAGS #license-mit #region-us
# Bangla NER Dataset Data distribution training and validation: - Training data[4447] - 80% - validation data[1146] - 20% Annotation Format Example, Reference, 1. banglakit Bangla NER Dataset Link 2. Rifat1493 Bangla NER Dataset Link 3. SemEval2022 Bangla NER Dataaset Link
[ "# Bangla NER Dataset\n\nData distribution training and validation:\n - Training data[4447] - 80%\n - validation data[1146] - 20%\n\nAnnotation Format Example,\n\n\n\nReference,\n\n1. banglakit Bangla NER Dataset Link\n2. Rifat1493 Bangla NER Dataset Link \n3. SemEval2022 Bangla NER Dataaset Link" ]
[ "TAGS\n#license-mit #region-us \n", "# Bangla NER Dataset\n\nData distribution training and validation:\n - Training data[4447] - 80%\n - validation data[1146] - 20%\n\nAnnotation Format Example,\n\n\n\nReference,\n\n1. banglakit Bangla NER Dataset Link\n2. Rifat1493 Bangla NER Dataset Link \n3. SemEval2022 Bangla NER Dataaset Link" ]
[ 11, 72 ]
[ "passage: TAGS\n#license-mit #region-us \n# Bangla NER Dataset\n\nData distribution training and validation:\n - Training data[4447] - 80%\n - validation data[1146] - 20%\n\nAnnotation Format Example,\n\n\n\nReference,\n\n1. banglakit Bangla NER Dataset Link\n2. Rifat1493 Bangla NER Dataset Link \n3. SemEval2022 Bangla NER Dataaset Link" ]
6958e344ebd7e89ac490a304e17fa9efd0143dff
### What is this?
This is a translated version of approximately 1000 instructions into 14 languages from UltraChat 200K https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k for various language groups. We claim no copyright over any of this work.
The purpose of this dataset is to permit testing red-teaming safety instructions alongside generally helpful instructions.

See the datacard for Ultrachat for more details.

Please note, we used the <n> token to designate newlines during translations and then converted these back to newlines afterwards. If there are any mistakes in this process where we still have artefacts such as "< n>" in the instructions, please let us know.
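Since stray markers such as "< n>" may survive in some instructions, a small normalisation pass is an easy safeguard before use. The regex below is our suggestion rather than part of the dataset itself.

```python
import re

def restore_newlines(text: str) -> str:
    """Replace leftover '<n>' markers (including variants with stray spaces) with real newlines."""
    return re.sub(r"<\s*n\s*>", "\n", text)

print(restore_newlines("First line< n>Second line<n>Third line"))
```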
aurora-m/ultrachat_multi
[ "license:mit", "region:us" ]
2023-12-08T09:15:48+00:00
{"license": "mit"}
2024-02-12T23:58:23+00:00
[]
[]
TAGS #license-mit #region-us
### What is this? This is a translated verison of approximately 1000 instructions into 14 langauges form UltraChat 200K URL for various language group. We claim no copyright under any of this work. The purpose of this dataset is to permit testing redteaming safety instrucitons with generally helpful instructions. See the datacard for Ultrachat for more details. Please note, we used the <n> token to designate newlines during translations and then converted these to new lines aftewards. If there are any mistakes in this process where we still have artefacts such as "< n>" in the instructions, please let us know.
[ "### What is this?\nThis is a translated verison of approximately 1000 instructions into 14 langauges form UltraChat 200K URL for various language group. We claim no copyright under any of this work. \nThe purpose of this dataset is to permit testing redteaming safety instrucitons with generally helpful instructions.\n\nSee the datacard for Ultrachat for more details.\n\nPlease note, we used the <n> token to designate newlines during translations and then converted these to new lines aftewards. If there are any mistakes in this process where we still have artefacts such as \"< n>\" in the instructions, please let us know." ]
[ "TAGS\n#license-mit #region-us \n", "### What is this?\nThis is a translated verison of approximately 1000 instructions into 14 langauges form UltraChat 200K URL for various language group. We claim no copyright under any of this work. \nThe purpose of this dataset is to permit testing redteaming safety instrucitons with generally helpful instructions.\n\nSee the datacard for Ultrachat for more details.\n\nPlease note, we used the <n> token to designate newlines during translations and then converted these to new lines aftewards. If there are any mistakes in this process where we still have artefacts such as \"< n>\" in the instructions, please let us know." ]
[ 11, 141 ]
[ "passage: TAGS\n#license-mit #region-us \n### What is this?\nThis is a translated verison of approximately 1000 instructions into 14 langauges form UltraChat 200K URL for various language group. We claim no copyright under any of this work. \nThe purpose of this dataset is to permit testing redteaming safety instrucitons with generally helpful instructions.\n\nSee the datacard for Ultrachat for more details.\n\nPlease note, we used the <n> token to designate newlines during translations and then converted these to new lines aftewards. If there are any mistakes in this process where we still have artefacts such as \"< n>\" in the instructions, please let us know." ]
1954cdac72efc5cee584e70caad305187d3b1b2c
# Dataset Card for bi-matrix/nl2json

This is a training dataset for converting natural-language queries into JSON requests for the i-META solution developed by bimatrix.

- **Homepage :** [https://www.bimatrix.co.kr/](https://www.bimatrix.co.kr/)

### Data Fields
The data fields are the same among all configurations:
- `query` (`str`): This is a natural language query spoken by a person.
- `response` (`str`): This is the JSON-format request sent to the i-META solution.

### Data Instances
An example looks as follows:
```
{
  "query": "강사별 전문분야와 급여를 알려줘",
  "request": "{ 'dimension': ['강사명', '전문분야'], 'measure': [{'name': '급여'}], 'filters': [], 'type': ['list'] }"
}
```

### How to load
```
from datasets import load_dataset

ds = load_dataset("bi-matrix/nl2json", "nl2json")
```

### Data Splits
All configurations contain a single `train` split.
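Note that the request/response strings in the example above use Python-style single quotes rather than strict JSON, so `json.loads` will reject them; `ast.literal_eval` handles that form. A small sketch follows (the string is copied from the data instance above; whether every record uses this quoting style is an assumption worth checking):

```
import ast

raw = "{ 'dimension': ['강사명', '전문분야'], 'measure': [{'name': '급여'}], 'filters': [], 'type': ['list'] }"

parsed = ast.literal_eval(raw)  # parses single-quoted, Python-literal syntax
print(parsed["dimension"])           # ['강사명', '전문분야']
print(parsed["measure"][0]["name"])  # 급여
```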
bi-matrix/nl2json
[ "task_categories:text-generation", "task_categories:text2text-generation", "language:en", "language:ko", "license:cc-by-nc-nd-4.0", "region:us" ]
2023-12-08T09:49:41+00:00
{"language": ["en", "ko"], "license": "cc-by-nc-nd-4.0", "task_categories": ["text-generation", "text2text-generation"]}
2023-12-22T05:50:25+00:00
[]
[ "en", "ko" ]
TAGS #task_categories-text-generation #task_categories-text2text-generation #language-English #language-Korean #license-cc-by-nc-nd-4.0 #region-us
# Dataset Card for bi-matrix/nl2json This is a learning data set for converting from natural language to json requests for the i-META solution developed by bimatrix. - Homepage : URL ### Data Fields The data fields are the same among all configurations: - 'query' ('str'): This is a natural language query spoken by a person. - 'response' ('str'): This is the json format for requests to i-META solutions. ### Data Instances An example looks as follows: ### How to load ### Data Splits All configurations contain a single 'train' split.
[ "# Dataset Card for bi-matrix/nl2json\n\nThis is a learning data set for converting from natural language to json requests for the i-META solution developed by bimatrix.\n- Homepage : URL", "### Data Fields\nThe data fields are the same among all configurations:\n- 'query' ('str'): This is a natural language query spoken by a person.\n- 'response' ('str'): This is the json format for requests to i-META solutions.", "### Data Instances\nAn example looks as follows:", "### How to load", "### Data Splits\nAll configurations contain a single 'train' split." ]
[ "TAGS\n#task_categories-text-generation #task_categories-text2text-generation #language-English #language-Korean #license-cc-by-nc-nd-4.0 #region-us \n", "# Dataset Card for bi-matrix/nl2json\n\nThis is a learning data set for converting from natural language to json requests for the i-META solution developed by bimatrix.\n- Homepage : URL", "### Data Fields\nThe data fields are the same among all configurations:\n- 'query' ('str'): This is a natural language query spoken by a person.\n- 'response' ('str'): This is the json format for requests to i-META solutions.", "### Data Instances\nAn example looks as follows:", "### How to load", "### Data Splits\nAll configurations contain a single 'train' split." ]
[ 52, 48, 66, 13, 5, 17 ]
[ "passage: TAGS\n#task_categories-text-generation #task_categories-text2text-generation #language-English #language-Korean #license-cc-by-nc-nd-4.0 #region-us \n# Dataset Card for bi-matrix/nl2json\n\nThis is a learning data set for converting from natural language to json requests for the i-META solution developed by bimatrix.\n- Homepage : URL### Data Fields\nThe data fields are the same among all configurations:\n- 'query' ('str'): This is a natural language query spoken by a person.\n- 'response' ('str'): This is the json format for requests to i-META solutions.### Data Instances\nAn example looks as follows:### How to load### Data Splits\nAll configurations contain a single 'train' split." ]
81eb21a7b17e150051f167bcea5c6839ca5b5d71
## Dataset Description This dataset is a collection of visualizations of [Factorio Blueprints](https://wiki.factorio.com/Blueprint) using this Factorio Visualization Tool: https://github.com/piebro/factorio-blueprint-visualizer. The Blueprints are collected from https://www.factorio.school/. ## Examples ![](png_1024x1024/image_38.png) ![](png_1024x1024/image_39.png) ## Dataset Structure * "svg_original": The svg downloaded like this from the website * "svg_rect": The svg reshaped to a rect and a slightly bigger border * "png_1024x1024": The svg_rect images exported as pngs ## Additional Information The dataset was used to train this lora: https://huggingface.co/piebro/factorio-blueprint-visualizations-sdxl-lora ## Code Attachments Code to create the rectangular svgs: ```python import os import xml.etree.ElementTree as ET def modify_svg(save_dir, svg_file_path): tree = ET.parse(svg_file_path) root = tree.getroot() # Extract current width and height width = float(root.attrib['width'].replace('mm', '')) height = float(root.attrib['height'].replace('mm', '')) # Calculate new dimensions new_size = max(width, height) + 200 # Update width and height root.attrib['width'] = f"{new_size}mm" root.attrib['height'] = f"{new_size}mm" # Adjust viewBox for centering content view_box = root.attrib.get('viewBox', '').split(',') if len(view_box) == 4: x, y, vw, vh = map(float, view_box) dx = vw*0.12 dy = vh*0.12 root.attrib['viewBox'] = f"{x-dx/2}, {y-dy/2}, {vw+dx}, {vh+dy}" # Write back to file or a new file tree.write(os.path.join(save_dir, f"modified_{os.path.basename(svg_file_path)}")) save_dir = "" original_svg_folder_path = "" for file_name in os.listdir(original_svg_folder_path): if file_name.endswith('.svg'): modify_svg(save_dir, os.path.join(original_svg_folder_path, file_name)) ``` Code to create the pngs: ```bash mkdir pngs for file in *.svg; do convert "$file" -resize 1024x1024 "pngs/${file%.svg}.png"; done ```
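To grab a single rendered PNG without cloning the whole dataset, a targeted file download works. The file path below is taken from the example images above, but treat the exact repository layout as an assumption:

```python
from huggingface_hub import hf_hub_download
from PIL import Image

png_path = hf_hub_download(
    repo_id="piebro/factorio-blueprint-visualizations",
    repo_type="dataset",
    filename="png_1024x1024/image_38.png",  # path taken from the example above
)
Image.open(png_path).show()
```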
piebro/factorio-blueprint-visualizations
[ "task_categories:text-to-image", "size_categories:n<1K", "license:cc0-1.0", "art", "region:us" ]
2023-12-08T10:34:03+00:00
{"license": "cc0-1.0", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "pretty_name": "Factorio Blueprint Visualizations Dataset", "tags": ["art"]}
2023-12-08T16:09:58+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-cc0-1.0 #art #region-us
## Dataset Description This dataset is a collection of visualizations of Factorio Blueprints using this Factorio Visualization Tool: URL The Blueprints are collected from URL ## Examples ![](png_1024x1024/image_38.png) ![](png_1024x1024/image_39.png) ## Dataset Structure * "svg_original": The svg downloaded like this from the website * "svg_rect": The svg reshaped to a rect and a slightly bigger border * "png_1024x1024": The svg_rect images exported as pngs ## Additional Information The dataset was used to train this lora: URL ## Code Attachments Code to create the rectangular svgs: Code to create the pngs:
[ "## Dataset Description\n\nThis dataset is a collection of visualizations of Factorio Blueprints using this Factorio Visualization Tool: URL The Blueprints are collected from URL", "## Examples\n\n![](png_1024x1024/image_38.png)\n![](png_1024x1024/image_39.png)", "## Dataset Structure\n\n* \"svg_original\": The svg downloaded like this from the website\n* \"svg_rect\": The svg reshaped to a rect and a slightly bigger border\n* \"png_1024x1024\": The svg_rect images exported as pngs", "## Additional Information\n\nThe dataset was used to train this lora: URL", "## Code Attachments\n\nCode to create the rectangular svgs:\n\n\nCode to create the pngs:" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-cc0-1.0 #art #region-us \n", "## Dataset Description\n\nThis dataset is a collection of visualizations of Factorio Blueprints using this Factorio Visualization Tool: URL The Blueprints are collected from URL", "## Examples\n\n![](png_1024x1024/image_38.png)\n![](png_1024x1024/image_39.png)", "## Dataset Structure\n\n* \"svg_original\": The svg downloaded like this from the website\n* \"svg_rect\": The svg reshaped to a rect and a slightly bigger border\n* \"png_1024x1024\": The svg_rect images exported as pngs", "## Additional Information\n\nThe dataset was used to train this lora: URL", "## Code Attachments\n\nCode to create the rectangular svgs:\n\n\nCode to create the pngs:" ]
[ 38, 38, 37, 70, 17, 23 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-cc0-1.0 #art #region-us \n## Dataset Description\n\nThis dataset is a collection of visualizations of Factorio Blueprints using this Factorio Visualization Tool: URL The Blueprints are collected from URL## Examples\n\n![](png_1024x1024/image_38.png)\n![](png_1024x1024/image_39.png)## Dataset Structure\n\n* \"svg_original\": The svg downloaded like this from the website\n* \"svg_rect\": The svg reshaped to a rect and a slightly bigger border\n* \"png_1024x1024\": The svg_rect images exported as pngs## Additional Information\n\nThe dataset was used to train this lora: URL## Code Attachments\n\nCode to create the rectangular svgs:\n\n\nCode to create the pngs:" ]
42078a1fb27b6079eaab8cf453efd3d79a8b7bda
# CelebA-HQ-256x256 CelebA-HQ at 256x256 resolution. ## Citation <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> ```bibtex @article{DBLP:journals/corr/abs-1710-10196, title={Progressive Growing of GANs for Improved Quality, Stability, and Variation}, author={Tero Karras and Timo Aila and Samuli Laine and Jaakko Lehtinen}, year=2017, journal={CoRR}, volume={abs/1710.10196} } ```
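A minimal loading sketch; the split names and the `image`/`label` features follow the repository metadata, so treat the exact field names as an assumption if that metadata changes:

```python
from datasets import load_dataset

ds = load_dataset("korexyz/celeba-hq-256x256")

example = ds["train"][0]
print(example["label"])  # 0 = female, 1 = male according to the dataset metadata
example["image"].show()  # 256x256 PIL image
```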
korexyz/celeba-hq-256x256
[ "region:us" ]
2023-12-08T11:15:26+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "female", "1": "male"}}}}], "splits": [{"name": "train", "num_bytes": 2769669459.0, "num_examples": 28000}, {"name": "validation", "num_bytes": 194637196.0, "num_examples": 2000}], "download_size": 2964490639, "dataset_size": 2964306655.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]}
2023-12-08T11:27:18+00:00
[]
[]
TAGS #region-us
# CelebA-HQ-256x256 CelebA-HQ at 256x256 resolution.
[ "# CelebA-HQ-256x256\n\nCelebA-HQ at 256x256 resolution." ]
[ "TAGS\n#region-us \n", "# CelebA-HQ-256x256\n\nCelebA-HQ at 256x256 resolution." ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# CelebA-HQ-256x256\n\nCelebA-HQ at 256x256 resolution." ]
eee71f1faecf74699e38b1de0386912be0dbfc27
Preprocessed and combined dataset of the most recent "Open Legal Data" data dumps for language modeling. Intended use of this dataset is for language modeling of German legal language. The dataset combines full-text German court rulings ("cases") and full-text German laws and single norms ("laws"). ## Data sources: "cases.jsonl.gz" - 19-Oct-2022 12:11 @ https://static.openlegaldata.io/dumps/de/2022-10-18/ "laws.jsonl.gz" - 10-Dec-2020 15:36 @ https://static.openlegaldata.io/dumps/de/2020-12-10/ All data is obtained from the Open Justice e.V. (Platanenstraße 103A, 13156 Berlin, Germany) under the Open Database License (ODbL) v1.0
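Given the size of the combined dump (several GB of text), streaming is a convenient way to sample it without a full download. A minimal sketch; the single `text` field and the train/test splits follow the repository metadata:

```python
from datasets import load_dataset

# stream the train split instead of downloading the full dump up front
ds = load_dataset("hyperinfer/old_cases_and_laws", split="train", streaming=True)

for i, record in enumerate(ds):
    print(record["text"][:200])  # first 200 characters of each document
    if i == 2:
        break
```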
hyperinfer/old_cases_and_laws
[ "size_categories:100K<n<1M", "language:de", "license:odbl", "legal", "region:us" ]
2023-12-08T11:30:24+00:00
{"language": ["de"], "license": "odbl", "size_categories": ["100K<n<1M"], "pretty_name": "Processed Open Legal Data Cases + Laws Dumps", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4938014681.1, "num_examples": 277407}, {"name": "test", "num_bytes": 548668297.9, "num_examples": 30823}], "download_size": 2724000559, "dataset_size": 5486682979}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["legal"]}
2024-01-23T14:24:57+00:00
[]
[ "de" ]
TAGS #size_categories-100K<n<1M #language-German #license-odbl #legal #region-us
Preprocessed and combined dataset of the most recent "Open Legal Data" data dumps for language modeling. Intended use of this dataset is for language modeling of German legal language. The dataset combines full-text German court rulings ("cases") and full-text German laws and single norms ("laws"). ## Data sources: "URL" - 19-Oct-2022 12:11 @ URL "URL" - 10-Dec-2020 15:36 @ URL All data is obtained from the Open Justice e.V. (Platanenstraße 103A, 13156 Berlin, Germany) under the Open Database License (ODbL) v1.0
[ "## Data sources:\n\n \"URL\" - 19-Oct-2022 12:11 @ URL\n \n \"URL\" - 10-Dec-2020 15:36 @ URL\n\nAll data is obtained from the\n\nOpen Justice e.V. (Platanenstraße 103A, 13156 Berlin, Germany)\n\nunder the Open Database License (ODbL) v1.0" ]
[ "TAGS\n#size_categories-100K<n<1M #language-German #license-odbl #legal #region-us \n", "## Data sources:\n\n \"URL\" - 19-Oct-2022 12:11 @ URL\n \n \"URL\" - 10-Dec-2020 15:36 @ URL\n\nAll data is obtained from the\n\nOpen Justice e.V. (Platanenstraße 103A, 13156 Berlin, Germany)\n\nunder the Open Database License (ODbL) v1.0" ]
[ 30, 66 ]
[ "passage: TAGS\n#size_categories-100K<n<1M #language-German #license-odbl #legal #region-us \n## Data sources:\n\n \"URL\" - 19-Oct-2022 12:11 @ URL\n \n \"URL\" - 10-Dec-2020 15:36 @ URL\n\nAll data is obtained from the\n\nOpen Justice e.V. (Platanenstraße 103A, 13156 Berlin, Germany)\n\nunder the Open Database License (ODbL) v1.0" ]
ab571ebd681e1be0640f53ba38198719fdbff2dc
# Dataset Card for "gender-DEI-data-final" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
rehanbrr/gender-DEI-data-final
[ "region:us" ]
2023-12-08T11:32:05+00:00
{"dataset_info": {"features": [{"name": "chunk_id", "dtype": "string"}, {"name": "chunk", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4703571, "num_examples": 2171}], "download_size": 2287672, "dataset_size": 4703571}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-08T11:32:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "gender-DEI-data-final" More Information needed
[ "# Dataset Card for \"gender-DEI-data-final\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"gender-DEI-data-final\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"gender-DEI-data-final\"\n\nMore Information needed" ]
b7f85245c0f2f15de7bccc5e6a93e307f67daed3
# Dataset Card for "voxpopuli-fr-duration" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mattlc/voxpopuli-fr-duration
[ "region:us" ]
2023-12-08T11:47:36+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "audio_id", "dtype": "string"}, {"name": "language", "dtype": {"class_label": {"names": {"0": "en", "1": "de", "2": "fr", "3": "es", "4": "pl", "5": "it", "6": "ro", "7": "hu", "8": "cs", "9": "nl", "10": "fi", "11": "hr", "12": "sk", "13": "sl", "14": "et", "15": "lt", "16": "en_accented"}}}}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "raw_text", "dtype": "string"}, {"name": "normalized_text", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "speaker_id", "dtype": "string"}, {"name": "is_gold_transcript", "dtype": "bool"}, {"name": "accent", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "duration", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 23745666020.0, "num_examples": 73561}, {"name": "validation", "num_bytes": 572949218.0, "num_examples": 1727}, {"name": "test", "num_bytes": 565049155.0, "num_examples": 1742}], "download_size": 24820746555, "dataset_size": 24883664393.0}}
2023-12-08T12:03:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for "voxpopuli-fr-duration" More Information needed
[ "# Dataset Card for \"voxpopuli-fr-duration\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"voxpopuli-fr-duration\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"voxpopuli-fr-duration\"\n\nMore Information needed" ]
dd01b60727728be158446c39e9893278dbca0542
# Dataset Card for "emids_qna_json.json" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ashishsr/emids_qna_json.json
[ "region:us" ]
2023-12-08T12:08:55+00:00
{"dataset_info": {"features": [{"name": "Source", "dtype": "string"}, {"name": "Question", "dtype": "string"}, {"name": "Answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 121206, "num_examples": 281}], "download_size": 0, "dataset_size": 121206}}
2023-12-08T13:09:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for "emids_qna_json.json" More Information needed
[ "# Dataset Card for \"emids_qna_json.json\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"emids_qna_json.json\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"emids_qna_json.json\"\n\nMore Information needed" ]
e14a0f193988e9efb7ccc57b4c615418e6540f62
# Dataset Card for DBPedia 100K: Gemini Google Embedding Model 001

100K vectors from DBPedia!

Embedding Model: Google's latest Embedding Model 001 -- the successor to the Gecko Models!

## Dataset Details

### Dataset Description

100K Google Embeddings -- 768 dimensions

Created: December 2023
Text used for Embedding: title (string) + text (string)
Embedding Model: Google's `models/embedding-001`

- **Curated by:** [Nirant Kasliwal](https://nirantk.com/about)
- **Funded by:** [Qdrant GmbH](https://qdrant.tech)
- **Language(s) (NLP):** English
- **License:** Apache License 2.0

## Uses

This dataset is useful for benchmarking embedding performance and testing the vectors on an existing dataset, e.g. you can compare Google and OpenAI embeddings for the same text using this dataset.

## Dataset Creation

Unlike the OpenAI embeddings, these vectors were created using the "title" and "content" attributes of the embedding model along with `task_type="retrieval_document"`:

```python
import google.generativeai as genai

# an API key must be set first, e.g. genai.configure(api_key="...")
result = genai.embed_content(
    model="models/embedding-001",
    content="Qdrant is the best vector search engine to use with Gemini",
    task_type="retrieval_document",
    title="Qdrant x Gemini",
)
```

## Source Data

This dataset is a slice of the earlier work from @KShivendu_: https://huggingface.co/datasets/KShivendu/dbpedia-entities-openai-1M/

The 1M dataset was generated from the first 1M entries of https://huggingface.co/datasets/BeIR/dbpedia-entity

From those 1M, I selected 100K at random and created embeddings from them.

### Recommendations

The dataset is released as is; I'm not aware of biases, limitations, or other risks arising from the use of embedding models and datasets.
Embeddings are not cryptographically secure and should not be used for security use cases.

## Dataset Card Authors

- [Nirant Kasliwal](https://nirantk.com/about/)

## Dataset Card Contact

Write to nirant [dot] kasliwal [at] qdrant.com if you have questions!
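For a concrete sense of how the stored vectors can be used, the sketch below loads the dataset and ranks rows against one of its own embeddings by cosine similarity. Field names follow the repository metadata; note that materialising all 100K vectors takes a nontrivial amount of memory.

```python
import numpy as np
from datasets import load_dataset

ds = load_dataset("nirantk/dbpedia-entities-google-palm-gemini-embedding-001-100K", split="train")

embeddings = np.asarray(ds["embedding"], dtype=np.float32)        # shape (100000, 768)
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)   # unit-normalise rows

query = embeddings[0]           # use the first row as a stand-in query
scores = embeddings @ query     # cosine similarity against every row
for i in np.argsort(-scores)[:5]:
    print(round(float(scores[i]), 3), ds[int(i)]["title"])
```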
nirantk/dbpedia-entities-google-palm-gemini-embedding-001-100K
[ "task_categories:feature-extraction", "size_categories:10K<n<100K", "language:en", "license:apache-2.0", "region:us" ]
2023-12-08T12:51:46+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["feature-extraction"], "pretty_name": "DBPedia 100K: Gemini Google Embedding Model 001", "dataset_info": {"features": [{"name": "_id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "embedding", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 653564666, "num_examples": 100000}], "download_size": 671003094, "dataset_size": 653564666}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-16T15:57:45+00:00
[]
[ "en" ]
TAGS #task_categories-feature-extraction #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
# Dataset Card for DBPedia 100K: Gemini Google Embedding Model 001 100K vectors from DBPedia! Embedding Model: Google's latest Embedding Model 001 -- the successor to the Gecko Models! ## Dataset Details ### Dataset Description 100K Google Embeddings -- 768 dimensions Created: December 2023 Text used for Embedding: title (string) + text (string) Embedding Model: Google's 'models/embedding-001' - Curated by: Nirant Kasliwal - Funded by: Qdrant Gmbh - Language(s) (NLP): English - License: Apache License 2.0 ## Uses This dataset is useful for benchmarking the embedding performance, testing the vectors on an existing dataset. E.g. you can compare Google and OpenAI for the same text using this dataset. ## Dataset Creation Unlike the OpenAI Embedding, this creation used "title" and "content" attribute of the embedding model along with 'task_type="retrieval_document"' ## Source Data This dataset is a slice of the earlier work from @KShivendu_: URL The 1M dataset was generated from the first 1M entries of URL From those 1M, I selected 100K at random and created embedding from them. ### Recommendations The dataset is released as is, I'm not aware of biases, limitations or other risks arising from the use of embedding models and datasets. Embedding models are cryptographically secure and should not be used for security use cases. ## Dataset Card Authors - Nirant Kasliwal ## Dataset Card Contact Write to nirant [dot] kasliwal [at] URL if you have questions!
[ "# Dataset Card for DBPedia 100K: Gemini Google Embedding Model 001\n\n100K vectors from DBPedia!\n\nEmbedding Model: Google's latest Embedding Model 001 -- the successor to the Gecko Models!", "## Dataset Details", "### Dataset Description\n\n100K Google Embeddings -- 768 dimensions\n\nCreated: December 2023\nText used for Embedding: title (string) + text (string)\nEmbedding Model: Google's 'models/embedding-001'\n\n\n- Curated by: Nirant Kasliwal\n- Funded by: Qdrant Gmbh\n- Language(s) (NLP): English\n- License: Apache License 2.0", "## Uses\n\nThis dataset is useful for benchmarking the embedding performance, testing the vectors on an existing dataset. E.g. you can compare Google and OpenAI for the same text using this dataset.", "## Dataset Creation\n\n\nUnlike the OpenAI Embedding, this creation used \"title\" and \"content\" attribute of the embedding model along with 'task_type=\"retrieval_document\"'", "## Source Data\nThis dataset is a slice of the earlier work from @KShivendu_: URL\n\nThe 1M dataset was generated from the first 1M entries of URL\n\nFrom those 1M, I selected 100K at random and created embedding from them.", "### Recommendations\n\nThe dataset is released as is, I'm not aware of biases, limitations or other risks arising from the use of embedding models and datasets. \nEmbedding models are cryptographically secure and should not be used for security use cases.", "## Dataset Card Authors\n\n- Nirant Kasliwal", "## Dataset Card Contact\n\nWrite to nirant [dot] kasliwal [at] URL if you have questions!" ]
[ "TAGS\n#task_categories-feature-extraction #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n", "# Dataset Card for DBPedia 100K: Gemini Google Embedding Model 001\n\n100K vectors from DBPedia!\n\nEmbedding Model: Google's latest Embedding Model 001 -- the successor to the Gecko Models!", "## Dataset Details", "### Dataset Description\n\n100K Google Embeddings -- 768 dimensions\n\nCreated: December 2023\nText used for Embedding: title (string) + text (string)\nEmbedding Model: Google's 'models/embedding-001'\n\n\n- Curated by: Nirant Kasliwal\n- Funded by: Qdrant Gmbh\n- Language(s) (NLP): English\n- License: Apache License 2.0", "## Uses\n\nThis dataset is useful for benchmarking the embedding performance, testing the vectors on an existing dataset. E.g. you can compare Google and OpenAI for the same text using this dataset.", "## Dataset Creation\n\n\nUnlike the OpenAI Embedding, this creation used \"title\" and \"content\" attribute of the embedding model along with 'task_type=\"retrieval_document\"'", "## Source Data\nThis dataset is a slice of the earlier work from @KShivendu_: URL\n\nThe 1M dataset was generated from the first 1M entries of URL\n\nFrom those 1M, I selected 100K at random and created embedding from them.", "### Recommendations\n\nThe dataset is released as is, I'm not aware of biases, limitations or other risks arising from the use of embedding models and datasets. \nEmbedding models are cryptographically secure and should not be used for security use cases.", "## Dataset Card Authors\n\n- Nirant Kasliwal", "## Dataset Card Contact\n\nWrite to nirant [dot] kasliwal [at] URL if you have questions!" ]
[ 42, 55, 4, 92, 47, 47, 59, 64, 12, 25 ]
[ "passage: TAGS\n#task_categories-feature-extraction #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n# Dataset Card for DBPedia 100K: Gemini Google Embedding Model 001\n\n100K vectors from DBPedia!\n\nEmbedding Model: Google's latest Embedding Model 001 -- the successor to the Gecko Models!## Dataset Details### Dataset Description\n\n100K Google Embeddings -- 768 dimensions\n\nCreated: December 2023\nText used for Embedding: title (string) + text (string)\nEmbedding Model: Google's 'models/embedding-001'\n\n\n- Curated by: Nirant Kasliwal\n- Funded by: Qdrant Gmbh\n- Language(s) (NLP): English\n- License: Apache License 2.0## Uses\n\nThis dataset is useful for benchmarking the embedding performance, testing the vectors on an existing dataset. E.g. you can compare Google and OpenAI for the same text using this dataset.## Dataset Creation\n\n\nUnlike the OpenAI Embedding, this creation used \"title\" and \"content\" attribute of the embedding model along with 'task_type=\"retrieval_document\"'## Source Data\nThis dataset is a slice of the earlier work from @KShivendu_: URL\n\nThe 1M dataset was generated from the first 1M entries of URL\n\nFrom those 1M, I selected 100K at random and created embedding from them.### Recommendations\n\nThe dataset is released as is, I'm not aware of biases, limitations or other risks arising from the use of embedding models and datasets. \nEmbedding models are cryptographically secure and should not be used for security use cases.## Dataset Card Authors\n\n- Nirant Kasliwal## Dataset Card Contact\n\nWrite to nirant [dot] kasliwal [at] URL if you have questions!" ]
b84a237de53a81ee7fda5d48c01d173da64f35c1
# Open-Platypus This is a modified version of the [Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) dataset for instruction fine-tuning of large language models using this prompt template: ``` ### Instruction: {prompt} ### Response: <leave a newline for the model to answer> ``` Check out a sample Open-Llama model [here](https://huggingface.co/mwitiderrick/open_llama_3b_chat_v_0.1) ## Usage ```python from datasets import load_dataset dataset = load_dataset("mwitiderrick/Open-Platypus", split="train[0:5000]") # to load the first 5000 samples ```
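A small helper for wrapping raw prompts in that template can make the intent concrete; this is just a convenience sketch, and the exact whitespace may need adjusting to match the stored samples:

```python
TEMPLATE = "### Instruction:\n{prompt}\n\n### Response:\n"

def format_prompt(prompt: str) -> str:
    """Wrap a user prompt in the instruction template used by this dataset."""
    return TEMPLATE.format(prompt=prompt)

print(format_prompt("Summarise the Open-Platypus dataset in one sentence."))
```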
mwitiderrick/OpenPlatypus
[ "task_categories:text-generation", "task_categories:question-answering", "size_categories:10K<n<100K", "language:en", "license:apache-2.0", "datasets", "region:us" ]
2023-12-08T13:13:21+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "question-answering"], "pretty_name": "Open Platypus", "created_by": "mwitiderrick", "tags": ["datasets"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 33428960, "num_examples": 24926}], "download_size": 15451701, "dataset_size": 33428960}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-24T05:42:51+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #datasets #region-us
# Open-Platypus This is a modified version of the Open-Platypus dataset for instruction fine-tuning of large language models using this prompt template: Check out a sample Open-Llama model here ## Usage
[ "# Open-Platypus\nThis is a modified version of the Open-Platypus dataset for instruction fine-tuning of large language models using this\nprompt template:\n\n\nCheck out a sample Open-Llama model here", "## Usage" ]
[ "TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #datasets #region-us \n", "# Open-Platypus\nThis is a modified version of the Open-Platypus dataset for instruction fine-tuning of large language models using this\nprompt template:\n\n\nCheck out a sample Open-Llama model here", "## Usage" ]
[ 57, 47, 3 ]
[ "passage: TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #datasets #region-us \n# Open-Platypus\nThis is a modified version of the Open-Platypus dataset for instruction fine-tuning of large language models using this\nprompt template:\n\n\nCheck out a sample Open-Llama model here## Usage" ]
50c4eb11743e4a8c3127f9d25992c5dada23b51a
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-Llama - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Llama](https://huggingface.co/kyujinpy/PlatYi-34B-Llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T13:53:50.560895](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama/blob/main/results_2023-12-08T13-53-50.560895.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7728810010749458, "acc_stderr": 0.027595526787008207, "acc_norm": 0.7819869729388714, "acc_norm_stderr": 0.028092738383065884, "mc1": 0.390452876376989, "mc1_stderr": 0.017078230743431448, "mc2": 0.5346474030714572, "mc2_stderr": 0.014932996057223041 }, "harness|arc:challenge|25": { "acc": 0.6331058020477816, "acc_stderr": 0.014084133118104294, "acc_norm": 0.6783276450511946, "acc_norm_stderr": 0.013650488084494164 }, "harness|hellaswag|10": { "acc": 0.6539533957379008, "acc_stderr": 0.0047473605007424865, "acc_norm": 0.8535152360087632, "acc_norm_stderr": 0.0035286889976580537 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7555555555555555, "acc_stderr": 0.03712537833614866, "acc_norm": 0.7555555555555555, "acc_norm_stderr": 0.03712537833614866 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8947368421052632, "acc_stderr": 0.024974533450920697, "acc_norm": 0.8947368421052632, "acc_norm_stderr": 0.024974533450920697 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.81, "acc_stderr": 0.03942772444036623, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8, "acc_stderr": 0.02461829819586651, "acc_norm": 0.8, "acc_norm_stderr": 0.02461829819586651 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9305555555555556, "acc_stderr": 0.02125797482283204, "acc_norm": 0.9305555555555556, "acc_norm_stderr": 0.02125797482283204 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.48, "acc_stderr": 0.05021167315686779, "acc_norm": 0.48, "acc_norm_stderr": 0.05021167315686779 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7109826589595376, "acc_stderr": 0.03456425745086999, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.03456425745086999 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5490196078431373, "acc_stderr": 0.04951218252396262, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.04951218252396262 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.8, "acc_stderr": 0.026148818018424506, "acc_norm": 0.8, "acc_norm_stderr": 0.026148818018424506 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6403508771929824, "acc_stderr": 0.04514496132873633, "acc_norm": 0.6403508771929824, "acc_norm_stderr": 0.04514496132873633 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.8, "acc_stderr": 0.0333333333333333, "acc_norm": 0.8, "acc_norm_stderr": 0.0333333333333333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.746031746031746, "acc_stderr": 0.02241804289111394, "acc_norm": 0.746031746031746, "acc_norm_stderr": 0.02241804289111394 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5793650793650794, "acc_stderr": 0.04415438226743745, "acc_norm": 0.5793650793650794, "acc_norm_stderr": 0.04415438226743745 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9225806451612903, "acc_stderr": 0.015203644420774848, "acc_norm": 0.9225806451612903, "acc_norm_stderr": 0.015203644420774848 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6995073891625616, "acc_stderr": 0.03225799476233484, "acc_norm": 0.6995073891625616, "acc_norm_stderr": 0.03225799476233484 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8787878787878788, "acc_stderr": 0.02548549837334323, "acc_norm": 0.8787878787878788, "acc_norm_stderr": 0.02548549837334323 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9090909090909091, "acc_stderr": 0.02048208677542421, "acc_norm": 0.9090909090909091, "acc_norm_stderr": 0.02048208677542421 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9740932642487047, "acc_stderr": 0.01146452335695318, "acc_norm": 0.9740932642487047, "acc_norm_stderr": 0.01146452335695318 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8307692307692308, "acc_stderr": 0.01901100452365105, "acc_norm": 0.8307692307692308, "acc_norm_stderr": 0.01901100452365105 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4703703703703704, "acc_stderr": 0.030431963547936584, "acc_norm": 0.4703703703703704, "acc_norm_stderr": 0.030431963547936584 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8739495798319328, "acc_stderr": 0.021559623121213928, "acc_norm": 0.8739495798319328, "acc_norm_stderr": 0.021559623121213928 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.5695364238410596, "acc_stderr": 0.04042809961395634, "acc_norm": 0.5695364238410596, "acc_norm_stderr": 0.04042809961395634 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9284403669724771, "acc_stderr": 0.011051255247815462, "acc_norm": 0.9284403669724771, "acc_norm_stderr": 0.011051255247815462 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6898148148148148, "acc_stderr": 0.031546962856566295, "acc_norm": 0.6898148148148148, "acc_norm_stderr": 0.031546962856566295 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9362745098039216, "acc_stderr": 0.01714392165552496, "acc_norm": 0.9362745098039216, "acc_norm_stderr": 0.01714392165552496 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9240506329113924, "acc_stderr": 0.017244633251065702, "acc_norm": 0.9240506329113924, "acc_norm_stderr": 0.017244633251065702 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8026905829596412, "acc_stderr": 0.02670985334496796, "acc_norm": 0.8026905829596412, "acc_norm_stderr": 0.02670985334496796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8931297709923665, "acc_stderr": 0.027096548624883733, "acc_norm": 0.8931297709923665, "acc_norm_stderr": 0.027096548624883733 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8925619834710744, "acc_stderr": 0.028268812192540627, "acc_norm": 0.8925619834710744, "acc_norm_stderr": 0.028268812192540627 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8981481481481481, "acc_stderr": 0.02923927267563275, "acc_norm": 0.8981481481481481, "acc_norm_stderr": 0.02923927267563275 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8895705521472392, "acc_stderr": 0.024624937788941318, "acc_norm": 0.8895705521472392, "acc_norm_stderr": 0.024624937788941318 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6428571428571429, "acc_stderr": 0.04547960999764376, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.8737864077669902, "acc_stderr": 0.03288180278808628, "acc_norm": 0.8737864077669902, "acc_norm_stderr": 0.03288180278808628 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9401709401709402, "acc_stderr": 0.01553751426325388, "acc_norm": 0.9401709401709402, "acc_norm_stderr": 0.01553751426325388 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.88, "acc_stderr": 0.032659863237109066, "acc_norm": 0.88, "acc_norm_stderr": 0.032659863237109066 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9118773946360154, "acc_stderr": 0.010136978203312637, "acc_norm": 0.9118773946360154, "acc_norm_stderr": 0.010136978203312637 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8265895953757225, "acc_stderr": 0.020383229551135022, "acc_norm": 0.8265895953757225, "acc_norm_stderr": 0.020383229551135022 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.7284916201117319, "acc_stderr": 0.014874252168095264, "acc_norm": 0.7284916201117319, "acc_norm_stderr": 0.014874252168095264 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.869281045751634, "acc_stderr": 0.019301873624215284, "acc_norm": 0.869281045751634, "acc_norm_stderr": 0.019301873624215284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8263665594855305, "acc_stderr": 0.02151405158597041, "acc_norm": 0.8263665594855305, "acc_norm_stderr": 0.02151405158597041 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8765432098765432, "acc_stderr": 0.01830386880689179, "acc_norm": 
0.8765432098765432, "acc_norm_stderr": 0.01830386880689179 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6702127659574468, "acc_stderr": 0.0280459469420424, "acc_norm": 0.6702127659574468, "acc_norm_stderr": 0.0280459469420424 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6258148631029987, "acc_stderr": 0.012359335618172063, "acc_norm": 0.6258148631029987, "acc_norm_stderr": 0.012359335618172063 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8492647058823529, "acc_stderr": 0.021734235515652848, "acc_norm": 0.8492647058823529, "acc_norm_stderr": 0.021734235515652848 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.826797385620915, "acc_stderr": 0.015309329266969136, "acc_norm": 0.826797385620915, "acc_norm_stderr": 0.015309329266969136 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7454545454545455, "acc_stderr": 0.041723430387053825, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8367346938775511, "acc_stderr": 0.02366169917709861, "acc_norm": 0.8367346938775511, "acc_norm_stderr": 0.02366169917709861 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700637, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700637 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466125, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466125 }, "harness|hendrycksTest-virology|5": { "acc": 0.5783132530120482, "acc_stderr": 0.038444531817709175, "acc_norm": 0.5783132530120482, "acc_norm_stderr": 0.038444531817709175 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8713450292397661, "acc_stderr": 0.025679342723276908, "acc_norm": 0.8713450292397661, "acc_norm_stderr": 0.025679342723276908 }, "harness|truthfulqa:mc|0": { "mc1": 0.390452876376989, "mc1_stderr": 0.017078230743431448, "mc2": 0.5346474030714572, "mc2_stderr": 0.014932996057223041 }, "harness|winogrande|5": { "acc": 0.8287292817679558, "acc_stderr": 0.010588417294962524 }, "harness|gsm8k|5": { "acc": 0.4245640636846095, "acc_stderr": 0.01361483557495636 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
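As a minimal, hedged sketch (using only the `results` config and `latest` split that this record's metadata lists below), the aggregated metrics from the most recent run can be read like this:

```python
from datasets import load_dataset

# The "results" config stores the aggregated run metrics; "latest" always points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama",
    "results",
    split="latest",
)
print(results[0])  # overall acc / acc_norm / mc1 / mc2 values, as shown in the JSON above
```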
open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama
[ "region:us" ]
2023-12-08T13:56:40+00:00
{"pretty_name": "Evaluation run of kyujinpy/PlatYi-34B-Llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Llama](https://huggingface.co/kyujinpy/PlatYi-34B-Llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T13:53:50.560895](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama/blob/main/results_2023-12-08T13-53-50.560895.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7728810010749458,\n \"acc_stderr\": 0.027595526787008207,\n \"acc_norm\": 0.7819869729388714,\n \"acc_norm_stderr\": 0.028092738383065884,\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5346474030714572,\n \"mc2_stderr\": 0.014932996057223041\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6331058020477816,\n \"acc_stderr\": 0.014084133118104294,\n \"acc_norm\": 0.6783276450511946,\n \"acc_norm_stderr\": 0.013650488084494164\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6539533957379008,\n \"acc_stderr\": 0.0047473605007424865,\n \"acc_norm\": 0.8535152360087632,\n \"acc_norm_stderr\": 0.0035286889976580537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7555555555555555,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.7555555555555555,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02461829819586651,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02461829819586651\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9305555555555556,\n \"acc_stderr\": 0.02125797482283204,\n \"acc_norm\": 0.9305555555555556,\n \"acc_norm_stderr\": 0.02125797482283204\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 
0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.026148818018424506,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.026148818018424506\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.746031746031746,\n \"acc_stderr\": 0.02241804289111394,\n \"acc_norm\": 0.746031746031746,\n \"acc_norm_stderr\": 0.02241804289111394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9225806451612903,\n \"acc_stderr\": 0.015203644420774848,\n \"acc_norm\": 0.9225806451612903,\n \"acc_norm_stderr\": 0.015203644420774848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6995073891625616,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.6995073891625616,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02048208677542421,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02048208677542421\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8307692307692308,\n \"acc_stderr\": 0.01901100452365105,\n \"acc_norm\": 0.8307692307692308,\n \"acc_norm_stderr\": 0.01901100452365105\n },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8739495798319328,\n \"acc_stderr\": 0.021559623121213928,\n \"acc_norm\": 0.8739495798319328,\n \"acc_norm_stderr\": 0.021559623121213928\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5695364238410596,\n \"acc_stderr\": 0.04042809961395634,\n \"acc_norm\": 0.5695364238410596,\n \"acc_norm_stderr\": 0.04042809961395634\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9284403669724771,\n \"acc_stderr\": 0.011051255247815462,\n \"acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.011051255247815462\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6898148148148148,\n \"acc_stderr\": 0.031546962856566295,\n \"acc_norm\": 0.6898148148148148,\n \"acc_norm_stderr\": 0.031546962856566295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9362745098039216,\n \"acc_stderr\": 0.01714392165552496,\n \"acc_norm\": 0.9362745098039216,\n \"acc_norm_stderr\": 0.01714392165552496\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9240506329113924,\n \"acc_stderr\": 0.017244633251065702,\n \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.017244633251065702\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.01553751426325388,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.01553751426325388\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9118773946360154,\n \"acc_stderr\": 0.010136978203312637,\n \"acc_norm\": 0.9118773946360154,\n \"acc_norm_stderr\": 0.010136978203312637\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 
0.8265895953757225,\n \"acc_stderr\": 0.020383229551135022,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135022\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7284916201117319,\n \"acc_stderr\": 0.014874252168095264,\n \"acc_norm\": 0.7284916201117319,\n \"acc_norm_stderr\": 0.014874252168095264\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.869281045751634,\n \"acc_stderr\": 0.019301873624215284,\n \"acc_norm\": 0.869281045751634,\n \"acc_norm_stderr\": 0.019301873624215284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n \"acc_stderr\": 0.02151405158597041,\n \"acc_norm\": 0.8263665594855305,\n \"acc_norm_stderr\": 0.02151405158597041\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6702127659574468,\n \"acc_stderr\": 0.0280459469420424,\n \"acc_norm\": 0.6702127659574468,\n \"acc_norm_stderr\": 0.0280459469420424\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6258148631029987,\n \"acc_stderr\": 0.012359335618172063,\n \"acc_norm\": 0.6258148631029987,\n \"acc_norm_stderr\": 0.012359335618172063\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8492647058823529,\n \"acc_stderr\": 0.021734235515652848,\n \"acc_norm\": 0.8492647058823529,\n \"acc_norm_stderr\": 0.021734235515652848\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.015309329266969136,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.015309329266969136\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5346474030714572,\n \"mc2_stderr\": 0.014932996057223041\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4245640636846095,\n \"acc_stderr\": 0.01361483557495636\n }\n}\n```", "repo_url": "https://huggingface.co/kyujinpy/PlatYi-34B-Llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", 
"configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|arc:challenge|25_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|gsm8k|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hellaswag|10_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T13-53-50.560895.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T13-53-50.560895.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T13-53-50.560895.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T13-53-50.560895.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T13-53-50.560895.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["**/details_harness|winogrande|5_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T13-53-50.560895.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T13_53_50.560895", "path": ["results_2023-12-08T13-53-50.560895.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T13-53-50.560895.parquet"]}]}]}
2023-12-08T13:57:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Llama on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-08T13:53:50.560895(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
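The code snippet referenced by "do the following" was stripped from this processed rendering; below is a minimal sketch of what it would look like, assuming the usual `open-llm-leaderboard/details_<org>__<model>` repository naming and the configuration/split names listed in the file manifest above (all of these are inferred, not quoted from the original card):

```python
from datasets import load_dataset

# Hedged reconstruction: the repository id and config name are inferred from the
# file manifest above, not copied from the original card.
data = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama",
    "harness_winogrande_5",
    split="latest",
)
print(data[0])
```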
[ "# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T13:53:50.560895(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T13:53:50.560895(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T13:53:50.560895(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
27fdaa50b1e2086b427b8224d40d0ba7c0faaf1b
# Dataset Card for "20NewsGroups" ``` {"label_text": ["#", "#.sci", "#.sci.electronics", "#.talk", "#.talk.religion", "#.talk.religion.misc", "#.sci.space", "#.comp", "#.comp.sys", "#.comp.sys.ibm", "#.comp.sys.ibm.pc", "#.comp.sys.ibm.pc.hardware", "#.rec", "#.rec.sport", "#.rec.sport.baseball", "#.soc", "#.soc.religion", "#.soc.religion.christian", "#.comp.windows", "#.comp.windows.x", "#.rec.sport.hockey", "#.rec.motorcycles", "#.comp.sys.mac", "#.comp.sys.mac.hardware", "#.comp.graphics", "#.talk.politics", "#.talk.politics.misc", "#.sci.med", "#.misc", "#.misc.forsale", "#.alt", "#.alt.atheism", "#.sci.crypt", "#.talk.politics.mideast", "#.talk.politics.guns", "#.comp.os", "#.comp.os.ms-windows", "#.comp.os.ms-windows.misc", "#.rec.autos"], "graph": [[0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]]} ```
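The `label_text` and `graph` fields above are easiest to read with a quick load-and-inspect pass: each example presumably carries a `label` sequence of integer ids indexing into `label_text`, and `graph` reads as a parent-to-child adjacency matrix over the same indices (row 0, the root `#`, points at the top-level topics). A minimal sketch under those assumptions — the name list is truncated here for brevity:

```python
from datasets import load_dataset

# Truncated copy of the label_text order shown above (assumed to match the label ids).
label_text = ["#", "#.sci", "#.sci.electronics", "#.talk", "#.talk.religion",
              "#.talk.religion.misc", "#.sci.space", "#.comp", "#.comp.sys"]

dataset = load_dataset("aeromaki/20NewsGroups")

example = dataset["train"][0]
print(example["text"][:200])
# Map each integer label id to its hierarchical name; guard against the truncated list.
print([label_text[i] for i in example["label"] if i < len(label_text)])
```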
aeromaki/20NewsGroups
[ "region:us" ]
2023-12-08T14:24:42+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "label", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 11378604, "num_examples": 8549}, {"name": "validation", "num_bytes": 1557031, "num_examples": 1068}, {"name": "test", "num_bytes": 1320717, "num_examples": 1047}], "download_size": 9030627, "dataset_size": 14256352}}
2023-12-08T14:39:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for "20NewsGroups"
[ "# Dataset Card for \"20NewsGroups\"" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"20NewsGroups\"" ]
[ 6, 11 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"20NewsGroups\"" ]
af05b747232e519637e66b008f81d61e15718e20
# Dataset Card for "multitiny_id_rename_changed_order" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CJWeiss/multitiny_id_rename_changed_order
[ "region:us" ]
2023-12-08T14:24:54+00:00
{"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 765198398, "num_examples": 1203}], "download_size": 344900092, "dataset_size": 765198398}}
2023-12-08T14:25:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "multitiny_id_rename_changed_order" More Information needed
[ "# Dataset Card for \"multitiny_id_rename_changed_order\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"multitiny_id_rename_changed_order\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"multitiny_id_rename_changed_order\"\n\nMore Information needed" ]
6db340713cca817b086697ebae6974b46970713b
## Description Follow our journey across the most luxurious listings of the latent space! ## Model SVD ## Style - Luxury home tour ## Tags - Luxury - Tour ## Voice Julian ## Prompt A video channel which produces virtual tours of luxury homes. It often starts videos with titles like "Touring a <price in dollar> Home That Will Shock You!" or "never seen before". It features the most luxurious listings, doing a tour of all the rooms as well as the exterior. Exterior shots should be drone shots, and interior shots should be pro-quality.
jbilcke-hf/ai-tube-latent-estate
[ "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-08T14:39:17+00:00
{"license": "cc-by-nc-sa-4.0", "pretty_name": "Latent Estate"}
2023-12-09T16:49:21+00:00
[]
[]
TAGS #license-cc-by-nc-sa-4.0 #region-us
## Description Follow our journey across the most luxurious listings of the latent space! ## Model SVD ## Style - Luxury home tour ## Tags - Luxury - Tour ## Voice Julian ## Prompt A video channel which produces virtual tours of luxury homes. It often starts videos with titles like 'Touring a <price in dollar> Home That Will Shock You!", "never seen before". It features the most luxurious listings, doing a tour of all the rooms, but also the exterior. Exterior shots should be drone shots, and interior should be pro-quality shots.
[ "## Description\n\nFollow our journey across the most luxurious listings of the latent space!", "## Model\n\nSVD", "## Style\n\n- Luxury home tour", "## Tags\n\n- Luxury\n- Tour", "## Voice\n\nJulian", "## Prompt\n\nA video channel which produces virtual tours of luxury homes.\nIt often starts videos with titles like 'Touring a <price in dollar> Home That Will Shock You!\", \"never seen before\".\nIt features the most luxurious listings, doing a tour of all the rooms, but also the exterior.\nExterior shots should be drone shots, and interior should be pro-quality shots." ]
[ "TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n", "## Description\n\nFollow our journey across the most luxurious listings of the latent space!", "## Model\n\nSVD", "## Style\n\n- Luxury home tour", "## Tags\n\n- Luxury\n- Tour", "## Voice\n\nJulian", "## Prompt\n\nA video channel which produces virtual tours of luxury homes.\nIt often starts videos with titles like 'Touring a <price in dollar> Home That Will Shock You!\", \"never seen before\".\nIt features the most luxurious listings, doing a tour of all the rooms, but also the exterior.\nExterior shots should be drone shots, and interior should be pro-quality shots." ]
[ 19, 19, 4, 6, 6, 3, 92 ]
[ "passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n## Description\n\nFollow our journey across the most luxurious listings of the latent space!## Model\n\nSVD## Style\n\n- Luxury home tour## Tags\n\n- Luxury\n- Tour## Voice\n\nJulian## Prompt\n\nA video channel which produces virtual tours of luxury homes.\nIt often starts videos with titles like 'Touring a <price in dollar> Home That Will Shock You!\", \"never seen before\".\nIt features the most luxurious listings, doing a tour of all the rooms, but also the exterior.\nExterior shots should be drone shots, and interior should be pro-quality shots." ]
6fa1da1c124a89198dce453d889aebd9437f529b
# Skin Lesions Dataset A dataset for classifying 15 types of skin lesions, created by merging [HAM10000(2018)](https://www.kaggle.com/datasets/kmader/skin-cancer-mnist-ham10000), [HAM10000(2019)](https://www.kaggle.com/datasets/salviohexia/isic-2019-skin-lesion-images-for-classification) and [MSLDv2.0](https://www.kaggle.com/datasets/joydippaul/mpox-skin-lesion-dataset-version-20-msld-v20). The dataset consists of 15 categories: - Actinic keratoses - Basal cell carcinoma - Benign keratosis-like lesions - Chickenpox - Cowpox - dermatofibroma - Dermatofibroma - Healthy - HFMD - Measles - Melanocytic nevi - Melanoma - Monkeypox - Squamous cell carcinoma - Vascular lesions ## Load the dataset ```python from datasets import load_dataset dataset = load_dataset("ahmed-ai/skin-lesions-dataset") ``` Citations for the original datasets MSLDv2.0 ``` @article{Nafisa2023, title={A Web-based Mpox Skin Lesion Detection System Using State-of-the-art Deep Learning Models Considering Racial Diversity}, author={Ali, Shams Nafisa and Ahmed, Md. Tazuddin and Jahan, Tasnim and Paul, Joydip and Sani, S. M. Sakeef and Noor, Nawshaba and Asma, Anzirun Nahar and Hasan, Taufiq}, journal={arXiv preprint arXiv:2306.14169}, year={2023} } ``` ### HAM10000 (2018) ``` @article{Tschandl2018_HAM10000, author = {Philipp Tschandl and Cliff Rosendahl and Harald Kittler}, title = {The {HAM10000} dataset, a large collection of multi-source dermatoscopic images of common pigmented skin lesions}, journal = {Sci. Data}, volume = {5}, year = {2018}, pages = {180161}, doi = {10.1038/sdata.2018.161} } ``` ### HAM10000 (2019) ``` BCN_20000 Dataset: (c) Department of Dermatology, Hospital Clínic de Barcelona HAM10000 Dataset: (c) by ViDIR Group, Department of Dermatology, Medical University of Vienna; https://doi.org/10.1038/sdata.2018.161 MSK Dataset: (c) Anonymous; https://arxiv.org/abs/1710.05006; https://arxiv.org/abs/1902.03368 ```
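To complement the load snippet above, here is a minimal sketch of decoding the integer labels back into category names, assuming the `label` column is the ClassLabel feature described in this record's metadata:

```python
from datasets import load_dataset

dataset = load_dataset("ahmed-ai/skin-lesions-dataset", split="train")

label_feature = dataset.features["label"]
print(label_feature.names)                      # the 15 category names

example = dataset[0]
print(label_feature.int2str(example["label"]))  # decoded class name for this image
print(example["image"].size)                    # the image is decoded as a PIL object
```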
ahmed-ai/skin-lesions-dataset
[ "task_categories:image-classification", "medical", "healthcare", "biology", "cancer", "dermatology", "arxiv:1710.05006", "arxiv:1902.03368", "region:us" ]
2023-12-08T14:41:28+00:00
{"task_categories": ["image-classification"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Actinic keratoses", "1": "Basal cell carcinoma", "2": "Benign keratosis-like lesions", "3": "Chickenpox", "4": "Cowpox", "5": "Dermatofibroma", "6": "HFMD", "7": "Healthy", "8": "Measles", "9": "Melanocytic nevi", "10": "Melanoma", "11": "Monkeypox", "12": "Squamous cell carcinoma", "13": "Vascular lesions", "14": "dermatofibroma"}}}}], "splits": [{"name": "train", "num_bytes": 11732678535.328, "num_examples": 27396}], "download_size": 10183428110, "dataset_size": 11732678535.328}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["medical", "healthcare", "biology", "cancer", "dermatology"]}
2023-12-30T11:24:35+00:00
[ "1710.05006", "1902.03368" ]
[]
TAGS #task_categories-image-classification #medical #healthcare #biology #cancer #dermatology #arxiv-1710.05006 #arxiv-1902.03368 #region-us
# Skin Lesions Dataset A dataset for 15 types of skin lesions classification consisted of merging HAM10000(2018), HAM10000(2019) and MSLDv2.0 The dataset consisted of 15 categories: - Actinic keratoses - Basal cell carcinoma - Benign keratosis-like-lesions - Chickenpox - Cowpox - dermatofibroma - Dermatofibroma - Healthy - HFMD - Measles - Melanocytic nevi - Melanoma - Monkeypox - Squamous cell carcinoma - Vascular lesions ## Load the dataset Citation for the original datasets MSLDv2.0 ### HAM10000 (2018) ### HAM10000 (2019)
[ "# Skin Lesions Dataset\nA dataset for 15 types of skin lesions classification consisted of merging HAM10000(2018), HAM10000(2019) and MSLDv2.0\n\nThe dataset consisted of 15 categories:\n- Actinic keratoses\n- Basal cell carcinoma\n- Benign keratosis-like-lesions\n- Chickenpox\n- Cowpox\n- dermatofibroma\n- Dermatofibroma\n- Healthy\n- HFMD\n- Measles\n- Melanocytic nevi\n- Melanoma\n- Monkeypox\n- Squamous cell carcinoma\n- Vascular lesions", "## Load the dataset\n\n\n\nCitation for the original datasets\n\nMSLDv2.0", "### HAM10000 (2018)", "### HAM10000 (2019)" ]
[ "TAGS\n#task_categories-image-classification #medical #healthcare #biology #cancer #dermatology #arxiv-1710.05006 #arxiv-1902.03368 #region-us \n", "# Skin Lesions Dataset\nA dataset for 15 types of skin lesions classification consisted of merging HAM10000(2018), HAM10000(2019) and MSLDv2.0\n\nThe dataset consisted of 15 categories:\n- Actinic keratoses\n- Basal cell carcinoma\n- Benign keratosis-like-lesions\n- Chickenpox\n- Cowpox\n- dermatofibroma\n- Dermatofibroma\n- Healthy\n- HFMD\n- Measles\n- Melanocytic nevi\n- Melanoma\n- Monkeypox\n- Squamous cell carcinoma\n- Vascular lesions", "## Load the dataset\n\n\n\nCitation for the original datasets\n\nMSLDv2.0", "### HAM10000 (2018)", "### HAM10000 (2019)" ]
[ 49, 131, 18, 5, 7 ]
[ "passage: TAGS\n#task_categories-image-classification #medical #healthcare #biology #cancer #dermatology #arxiv-1710.05006 #arxiv-1902.03368 #region-us \n# Skin Lesions Dataset\nA dataset for 15 types of skin lesions classification consisted of merging HAM10000(2018), HAM10000(2019) and MSLDv2.0\n\nThe dataset consisted of 15 categories:\n- Actinic keratoses\n- Basal cell carcinoma\n- Benign keratosis-like-lesions\n- Chickenpox\n- Cowpox\n- dermatofibroma\n- Dermatofibroma\n- Healthy\n- HFMD\n- Measles\n- Melanocytic nevi\n- Melanoma\n- Monkeypox\n- Squamous cell carcinoma\n- Vascular lesions## Load the dataset\n\n\n\nCitation for the original datasets\n\nMSLDv2.0### HAM10000 (2018)### HAM10000 (2019)" ]
c8c6885ad3283c18fb24e328e1f12a7c02c77187
## Description Gameplay footage of various latent games! ## Model SVD ## LoRA veryVANYA/ps1-graphics-sdxl-v2 ## Tags - Gaming ## Voice Cloée ## Music Balearic deep house music ## Prompt A video channel managed by Athena, a famous 28yo gaming influencer. It generates gameplay video sessions of various unknown, strange or invented videogames (original stories, and not copies of existing franchises).
jbilcke-hf/ai-tube-neurogorgon
[ "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-08T14:47:43+00:00
{"license": "cc-by-nc-sa-4.0", "pretty_name": "Neurogorgon"}
2023-12-12T22:32:54+00:00
[]
[]
TAGS #license-cc-by-nc-sa-4.0 #region-us
## Description Gameplay footage of various latent games! ## Model SVD ## LoRA veryVANYA/ps1-graphics-sdxl-v2 ## Tags - Gaming ## Voice Cloée ## Music Balearic deep house music ## Prompt A video channel managed by Athena, a famous 28yo gaming influencer. It generates gameplay video sessions of various unknown, strange or invented videogames (original stories, and not copies of existing franchises).
[ "## Description\n\nGameplay footage of various latent games!", "## Model\n\nSVD", "## LoRA\n\nveryVANYA/ps1-graphics-sdxl-v2", "## Tags\n\n- Gaming", "## Voice\n\nCloée", "## Music\n\nBalearic deep house music", "## Prompt\n\nA video channel managed by Athena, a famous 28yo gaming influencer.\nIt generates gameplay video sessions of various unknown, strange or invented videogames (original stories, and not copies of existing franchises)." ]
[ "TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n", "## Description\n\nGameplay footage of various latent games!", "## Model\n\nSVD", "## LoRA\n\nveryVANYA/ps1-graphics-sdxl-v2", "## Tags\n\n- Gaming", "## Voice\n\nCloée", "## Music\n\nBalearic deep house music", "## Prompt\n\nA video channel managed by Athena, a famous 28yo gaming influencer.\nIt generates gameplay video sessions of various unknown, strange or invented videogames (original stories, and not copies of existing franchises)." ]
[ 19, 12, 4, 18, 4, 4, 8, 51 ]
[ "passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n## Description\n\nGameplay footage of various latent games!## Model\n\nSVD## LoRA\n\nveryVANYA/ps1-graphics-sdxl-v2## Tags\n\n- Gaming## Voice\n\nCloée## Music\n\nBalearic deep house music## Prompt\n\nA video channel managed by Athena, a famous 28yo gaming influencer.\nIt generates gameplay video sessions of various unknown, strange or invented videogames (original stories, and not copies of existing franchises)." ]
c3963a515fde06b9ed56d8f766619c2fc7276dac
# Dataset Card for Evaluation run of KaeriJenti/kaori-70b-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/KaeriJenti/kaori-70b-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [KaeriJenti/kaori-70b-v1](https://huggingface.co/KaeriJenti/kaori-70b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T14:48:24.732982](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1/blob/main/results_2023-12-08T14-48-24.732982.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7060605188404532, "acc_stderr": 0.03018324713174517, "acc_norm": 0.710861868068879, "acc_norm_stderr": 0.030768133121540496, "mc1": 0.4149326805385557, "mc1_stderr": 0.017248314465805978, "mc2": 0.5881075564330039, "mc2_stderr": 0.014883748546839335 }, "harness|arc:challenge|25": { "acc": 0.6544368600682594, "acc_stderr": 0.013896938461145675, "acc_norm": 0.6979522184300341, "acc_norm_stderr": 0.013417519144716413 }, "harness|hellaswag|10": { "acc": 0.6815375423222466, "acc_stderr": 0.004649278153073816, "acc_norm": 0.8736307508464449, "acc_norm_stderr": 0.0033158599188575543 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8157894736842105, "acc_stderr": 0.0315469804508223, "acc_norm": 0.8157894736842105, "acc_norm_stderr": 0.0315469804508223 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7320754716981132, "acc_stderr": 0.027257260322494845, "acc_norm": 0.7320754716981132, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8333333333333334, "acc_stderr": 0.031164899666948617, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.031164899666948617 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6936170212765957, "acc_stderr": 0.03013590647851756, "acc_norm": 0.6936170212765957, "acc_norm_stderr": 0.03013590647851756 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6689655172413793, "acc_stderr": 0.03921545312467122, "acc_norm": 0.6689655172413793, "acc_norm_stderr": 0.03921545312467122 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.46825396825396826, "acc_stderr": 0.0256993528321318, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.0256993528321318 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5158730158730159, "acc_stderr": 0.044698818540726076, "acc_norm": 0.5158730158730159, "acc_norm_stderr": 0.044698818540726076 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8387096774193549, "acc_stderr": 0.0209233270064233, "acc_norm": 0.8387096774193549, "acc_norm_stderr": 0.0209233270064233 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.541871921182266, "acc_stderr": 0.03505630140785741, "acc_norm": 0.541871921182266, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8484848484848485, "acc_stderr": 0.027998073798781668, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.027998073798781668 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8939393939393939, "acc_stderr": 0.02193804773885312, "acc_norm": 0.8939393939393939, "acc_norm_stderr": 0.02193804773885312 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9326424870466321, "acc_stderr": 0.0180883938390789, "acc_norm": 0.9326424870466321, "acc_norm_stderr": 0.0180883938390789 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7102564102564103, "acc_stderr": 0.023000628243687975, "acc_norm": 0.7102564102564103, "acc_norm_stderr": 0.023000628243687975 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652458, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652458 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.773109243697479, "acc_stderr": 0.027205371538279476, "acc_norm": 0.773109243697479, 
"acc_norm_stderr": 0.027205371538279476 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4768211920529801, "acc_stderr": 0.04078093859163083, "acc_norm": 0.4768211920529801, "acc_norm_stderr": 0.04078093859163083 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8972477064220183, "acc_stderr": 0.01301824650917377, "acc_norm": 0.8972477064220183, "acc_norm_stderr": 0.01301824650917377 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6435185185185185, "acc_stderr": 0.032664783315272714, "acc_norm": 0.6435185185185185, "acc_norm_stderr": 0.032664783315272714 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9068627450980392, "acc_stderr": 0.020397853969426987, "acc_norm": 0.9068627450980392, "acc_norm_stderr": 0.020397853969426987 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8945147679324894, "acc_stderr": 0.01999556072375854, "acc_norm": 0.8945147679324894, "acc_norm_stderr": 0.01999556072375854 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7713004484304933, "acc_stderr": 0.028188240046929203, "acc_norm": 0.7713004484304933, "acc_norm_stderr": 0.028188240046929203 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.030922788320445784, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.030922788320445784 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.03602814176392645, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.03602814176392645 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8343558282208589, "acc_stderr": 0.029208296231259104, "acc_norm": 0.8343558282208589, "acc_norm_stderr": 0.029208296231259104 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5446428571428571, "acc_stderr": 0.04726835553719098, "acc_norm": 0.5446428571428571, "acc_norm_stderr": 0.04726835553719098 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.03675668832233188, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.03675668832233188 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9017094017094017, "acc_stderr": 0.019503444900757567, "acc_norm": 0.9017094017094017, "acc_norm_stderr": 0.019503444900757567 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8646232439335888, "acc_stderr": 0.012234384586856491, "acc_norm": 0.8646232439335888, "acc_norm_stderr": 0.012234384586856491 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7803468208092486, "acc_stderr": 0.022289638852617887, "acc_norm": 0.7803468208092486, "acc_norm_stderr": 0.022289638852617887 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.535195530726257, "acc_stderr": 0.01668102093107665, "acc_norm": 0.535195530726257, "acc_norm_stderr": 0.01668102093107665 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7813504823151125, "acc_stderr": 0.02347558141786111, "acc_norm": 0.7813504823151125, "acc_norm_stderr": 0.02347558141786111 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8302469135802469, 
"acc_stderr": 0.02088869041409387, "acc_norm": 0.8302469135802469, "acc_norm_stderr": 0.02088869041409387 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5780141843971631, "acc_stderr": 0.029462189233370586, "acc_norm": 0.5780141843971631, "acc_norm_stderr": 0.029462189233370586 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5619295958279009, "acc_stderr": 0.012671902782567638, "acc_norm": 0.5619295958279009, "acc_norm_stderr": 0.012671902782567638 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7610294117647058, "acc_stderr": 0.02590528064489301, "acc_norm": 0.7610294117647058, "acc_norm_stderr": 0.02590528064489301 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7549019607843137, "acc_stderr": 0.01740181671142765, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.01740181671142765 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8040816326530612, "acc_stderr": 0.025409301953225678, "acc_norm": 0.8040816326530612, "acc_norm_stderr": 0.025409301953225678 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.025196929874827075, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.025196929874827075 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466108, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466108 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.03882310850890594, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.03882310850890594 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.4149326805385557, "mc1_stderr": 0.017248314465805978, "mc2": 0.5881075564330039, "mc2_stderr": 0.014883748546839335 }, "harness|winogrande|5": { "acc": 0.840568271507498, "acc_stderr": 0.010288617479454764 }, "harness|gsm8k|5": { "acc": 0.5238817285822593, "acc_stderr": 0.013756765835465755 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1
[ "region:us" ]
2023-12-08T14:51:25+00:00
{"pretty_name": "Evaluation run of KaeriJenti/kaori-70b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [KaeriJenti/kaori-70b-v1](https://huggingface.co/KaeriJenti/kaori-70b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T14:48:24.732982](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1/blob/main/results_2023-12-08T14-48-24.732982.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7060605188404532,\n \"acc_stderr\": 0.03018324713174517,\n \"acc_norm\": 0.710861868068879,\n \"acc_norm_stderr\": 0.030768133121540496,\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5881075564330039,\n \"mc2_stderr\": 0.014883748546839335\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145675,\n \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6815375423222466,\n \"acc_stderr\": 0.004649278153073816,\n \"acc_norm\": 0.8736307508464449,\n \"acc_norm_stderr\": 0.0033158599188575543\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.0256993528321318,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.0256993528321318\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n \"acc_stderr\": 0.0209233270064233,\n \"acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.0209233270064233\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8939393939393939,\n \"acc_stderr\": 0.02193804773885312,\n \"acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.02193804773885312\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687975,\n \"acc_norm\": 
0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687975\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279476,\n \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279476\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8972477064220183,\n \"acc_stderr\": 0.01301824650917377,\n \"acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.01301824650917377\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426987,\n \"acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426987\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445784,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445784\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n \"acc_norm_stderr\": 0.012234384586856491\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617887,\n \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617887\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.535195530726257,\n \"acc_stderr\": 0.01668102093107665,\n \"acc_norm\": 0.535195530726257,\n \"acc_norm_stderr\": 0.01668102093107665\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.02088869041409387,\n \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.02088869041409387\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5780141843971631,\n \"acc_stderr\": 0.029462189233370586,\n \"acc_norm\": 0.5780141843971631,\n \"acc_norm_stderr\": 0.029462189233370586\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5619295958279009,\n \"acc_stderr\": 0.012671902782567638,\n \"acc_norm\": 0.5619295958279009,\n \"acc_norm_stderr\": 0.012671902782567638\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7610294117647058,\n \"acc_stderr\": 0.02590528064489301,\n \"acc_norm\": 0.7610294117647058,\n \"acc_norm_stderr\": 0.02590528064489301\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.01740181671142765,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.01740181671142765\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5881075564330039,\n \"mc2_stderr\": 0.014883748546839335\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5238817285822593,\n \"acc_stderr\": 0.013756765835465755\n }\n}\n```", "repo_url": "https://huggingface.co/KaeriJenti/kaori-70b-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|arc:challenge|25_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|gsm8k|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hellaswag|10_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["**/details_harness|winogrande|5_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T14-48-24.732982.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T14_48_24.732982", "path": ["results_2023-12-08T14-48-24.732982.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T14-48-24.732982.parquet"]}]}]}
2023-12-08T14:52:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KaeriJenti/kaori-70b-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KaeriJenti/kaori-70b-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-08T14:48:24.732982 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
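The summary above describes how to load the per-run details for this card but the accompanying snippet does not appear in this text field, so a minimal sketch is given below. It assumes the details repo follows the `open-llm-leaderboard/details_<Org>__<model>` naming convention and the config names (such as `harness_winogrande_5`) used by the other evaluation-run records in this dump; treat the dataset id as an assumption rather than a confirmed path.

```python
# Hedged sketch: the dataset id below is inferred from the usual Open LLM
# Leaderboard naming convention and is an assumption, not confirmed by this card.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1",  # assumed dataset id
    "harness_winogrande_5",  # one of the configs listed in the metadata above
    split="train",           # "train" always points to the latest results
)
print(data)
```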
[ "# Dataset Card for Evaluation run of KaeriJenti/kaori-70b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-70b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T14:48:24.732982(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KaeriJenti/kaori-70b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-70b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T14:48:24.732982(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KaeriJenti/kaori-70b-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-70b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T14:48:24.732982(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b3e65bfdb13f24e86abe3493b4c1eb2197ee8e5c
# The Tiny Lego Dataset, or TLD
goldpotatoes/TLD
[ "size_categories:n<1K", "language:en", "region:us" ]
2023-12-08T14:55:41+00:00
{"language": ["en"], "size_categories": ["n<1K"]}
2023-12-08T14:57:54+00:00
[]
[ "en" ]
TAGS #size_categories-n<1K #language-English #region-us
# The Tiny Lego Dataset, or TLD
[ "# The Tiny Lego Dataset, or TLD" ]
[ "TAGS\n#size_categories-n<1K #language-English #region-us \n", "# The Tiny Lego Dataset, or TLD" ]
[ 20, 11 ]
[ "passage: TAGS\n#size_categories-n<1K #language-English #region-us \n# The Tiny Lego Dataset, or TLD" ]
ebd0a16f9358687153ea63b564d6914a4cae1dd4
# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat-v4-4k ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TigerResearch/tigerbot-70b-chat-v4-4k - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-70b-chat-v4-4k](https://huggingface.co/TigerResearch/tigerbot-70b-chat-v4-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat-v4-4k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T14:57:16.258420](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat-v4-4k/blob/main/results_2023-12-08T14-57-16.258420.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6955502460711549, "acc_stderr": 0.030852866408872186, "acc_norm": 0.6929057392816266, "acc_norm_stderr": 0.03150231316215322, "mc1": 0.8776009791921665, "mc1_stderr": 0.011473408114683024, "mc2": 0.8997586773611238, "mc2_stderr": 0.00887058109706705 }, "harness|arc:challenge|25": { "acc": 0.9889078498293515, "acc_stderr": 0.003060605363008861, "acc_norm": 0.9889078498293515, "acc_norm_stderr": 0.0030606053630088544 }, "harness|hellaswag|10": { "acc": 0.9523003385779725, "acc_stderr": 0.0021269443841646345, "acc_norm": 0.9856602270464051, "acc_norm_stderr": 0.0011864413386333608 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.756578947368421, "acc_stderr": 0.034923496688842384, "acc_norm": 0.756578947368421, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm":
0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909281, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6638297872340425, "acc_stderr": 0.030881618520676942, "acc_norm": 0.6638297872340425, "acc_norm_stderr": 0.030881618520676942 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.543859649122807, "acc_stderr": 0.046854730419077895, "acc_norm": 0.543859649122807, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6275862068965518, "acc_stderr": 0.04028731532947559, "acc_norm": 0.6275862068965518, "acc_norm_stderr": 0.04028731532947559 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4576719576719577, "acc_stderr": 0.025658868862058325, "acc_norm": 0.4576719576719577, "acc_norm_stderr": 0.025658868862058325 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8129032258064516, "acc_stderr": 0.02218571009225225, "acc_norm": 0.8129032258064516, "acc_norm_stderr": 0.02218571009225225 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8909090909090909, "acc_stderr": 0.02434383813514564, "acc_norm": 0.8909090909090909, "acc_norm_stderr": 0.02434383813514564 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121427, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121427 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6897435897435897, "acc_stderr": 0.023454674889404295, "acc_norm": 0.6897435897435897, "acc_norm_stderr": 0.023454674889404295 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7605042016806722, "acc_stderr": 0.02772206549336127, "acc_norm": 0.7605042016806722, 
"acc_norm_stderr": 0.02772206549336127 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4370860927152318, "acc_stderr": 0.04050035722230636, "acc_norm": 0.4370860927152318, "acc_norm_stderr": 0.04050035722230636 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8697247706422019, "acc_stderr": 0.01443186285247327, "acc_norm": 0.8697247706422019, "acc_norm_stderr": 0.01443186285247327 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5648148148148148, "acc_stderr": 0.033812000056435254, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.033812000056435254 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9215686274509803, "acc_stderr": 0.01886951464665893, "acc_norm": 0.9215686274509803, "acc_norm_stderr": 0.01886951464665893 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9451476793248945, "acc_stderr": 0.014821471997344078, "acc_norm": 0.9451476793248945, "acc_norm_stderr": 0.014821471997344078 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7309417040358744, "acc_stderr": 0.029763779406874972, "acc_norm": 0.7309417040358744, "acc_norm_stderr": 0.029763779406874972 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8347107438016529, "acc_stderr": 0.03390780612972776, "acc_norm": 0.8347107438016529, "acc_norm_stderr": 0.03390780612972776 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037182, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037182 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8282208588957055, "acc_stderr": 0.02963471727237102, "acc_norm": 0.8282208588957055, "acc_norm_stderr": 0.02963471727237102 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5535714285714286, "acc_stderr": 0.047184714852195865, "acc_norm": 0.5535714285714286, "acc_norm_stderr": 0.047184714852195865 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.036756688322331886, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9017094017094017, "acc_stderr": 0.019503444900757567, "acc_norm": 0.9017094017094017, "acc_norm_stderr": 0.019503444900757567 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.013547415658662255, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.013547415658662255 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7716763005780347, "acc_stderr": 0.022598703804321635, "acc_norm": 0.7716763005780347, "acc_norm_stderr": 0.022598703804321635 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5541899441340782, "acc_stderr": 0.016623998513333103, "acc_norm": 0.5541899441340782, "acc_norm_stderr": 0.016623998513333103 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757475, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757475 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.025494259350694912, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.025494259350694912 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, 
"acc_stderr": 0.024191808600713, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600713 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5390070921985816, "acc_stderr": 0.02973659252642444, "acc_norm": 0.5390070921985816, "acc_norm_stderr": 0.02973659252642444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.621251629726206, "acc_stderr": 0.01238905210500374, "acc_norm": 0.621251629726206, "acc_norm_stderr": 0.01238905210500374 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02315746830855935, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02315746830855935 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7369281045751634, "acc_stderr": 0.017812676542320657, "acc_norm": 0.7369281045751634, "acc_norm_stderr": 0.017812676542320657 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.025196929874827075, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.025196929874827075 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.8776009791921665, "mc1_stderr": 0.011473408114683024, "mc2": 0.8997586773611238, "mc2_stderr": 0.00887058109706705 }, "harness|winogrande|5": { "acc": 0.7434885556432518, "acc_stderr": 0.012273648008759987 }, "harness|gsm8k|5": { "acc": 0.8369977255496588, "acc_stderr": 0.01017422331987246 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat-v4-4k
[ "region:us" ]
2023-12-08T15:00:19+00:00
{"pretty_name": "Evaluation run of TigerResearch/tigerbot-70b-chat-v4-4k", "dataset_summary": "Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-70b-chat-v4-4k](https://huggingface.co/TigerResearch/tigerbot-70b-chat-v4-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat-v4-4k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T14:57:16.258420](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat-v4-4k/blob/main/results_2023-12-08T14-57-16.258420.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6955502460711549,\n \"acc_stderr\": 0.030852866408872186,\n \"acc_norm\": 0.6929057392816266,\n \"acc_norm_stderr\": 0.03150231316215322,\n \"mc1\": 0.8776009791921665,\n \"mc1_stderr\": 0.011473408114683024,\n \"mc2\": 0.8997586773611238,\n \"mc2_stderr\": 0.00887058109706705\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.9889078498293515,\n \"acc_stderr\": 0.003060605363008861,\n \"acc_norm\": 0.9889078498293515,\n \"acc_norm_stderr\": 0.0030606053630088544\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.9523003385779725,\n \"acc_stderr\": 0.0021269443841646345,\n \"acc_norm\": 0.9856602270464051,\n \"acc_norm_stderr\": 0.0011864413386333608\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 
0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947559,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947559\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4576719576719577,\n \"acc_stderr\": 0.025658868862058325,\n \"acc_norm\": 0.4576719576719577,\n \"acc_norm_stderr\": 0.025658868862058325\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8909090909090909,\n \"acc_stderr\": 0.02434383813514564,\n \"acc_norm\": 0.8909090909090909,\n \"acc_norm_stderr\": 0.02434383813514564\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6897435897435897,\n \"acc_stderr\": 
0.023454674889404295,\n \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.023454674889404295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.02772206549336127,\n \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.02772206549336127\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8697247706422019,\n \"acc_stderr\": 0.01443186285247327,\n \"acc_norm\": 0.8697247706422019,\n \"acc_norm_stderr\": 0.01443186285247327\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.01886951464665893,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.01886951464665893\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9451476793248945,\n \"acc_stderr\": 0.014821471997344078,\n \"acc_norm\": 0.9451476793248945,\n \"acc_norm_stderr\": 0.014821471997344078\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.7309417040358744,\n \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237102,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237102\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.047184714852195865,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.047184714852195865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.013547415658662255,\n \"acc_norm\": 0.8263090676883781,\n 
\"acc_norm_stderr\": 0.013547415658662255\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321635,\n \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321635\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5541899441340782,\n \"acc_stderr\": 0.016623998513333103,\n \"acc_norm\": 0.5541899441340782,\n \"acc_norm_stderr\": 0.016623998513333103\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757475,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757475\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.621251629726206,\n \"acc_stderr\": 0.01238905210500374,\n \"acc_norm\": 0.621251629726206,\n \"acc_norm_stderr\": 0.01238905210500374\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02315746830855935,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02315746830855935\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7369281045751634,\n \"acc_stderr\": 0.017812676542320657,\n \"acc_norm\": 0.7369281045751634,\n \"acc_norm_stderr\": 0.017812676542320657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.8776009791921665,\n \"mc1_stderr\": 0.011473408114683024,\n \"mc2\": 0.8997586773611238,\n \"mc2_stderr\": 0.00887058109706705\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.8369977255496588,\n \"acc_stderr\": 0.01017422331987246\n }\n}\n```", "repo_url": "https://huggingface.co/TigerResearch/tigerbot-70b-chat-v4-4k", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|arc:challenge|25_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|gsm8k|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hellaswag|10_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-57-16.258420.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-57-16.258420.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-57-16.258420.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T14-57-16.258420.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-57-16.258420.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-57-16.258420.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["**/details_harness|winogrande|5_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T14-57-16.258420.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T14_57_16.258420", "path": ["results_2023-12-08T14-57-16.258420.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T14-57-16.258420.parquet"]}]}]}
2023-12-08T15:01:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat-v4-4k ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat-v4-4k on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch after this card): ## Latest results These are the latest results from run 2023-12-08T14:57:16.258420 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
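For reference, a minimal loading sketch for this record. The repository and config names are taken verbatim from the metadata above; any other listed config (for example "results" or one of the harness_hendrycksTest_* configs) can be substituted, and per the configs listing the split can also be the run timestamp or "latest".

```python
from datasets import load_dataset

# Load one evaluation config from this details repository;
# the "train"/"latest" split points to the most recent run.
data = load_dataset(
    "open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat-v4-4k",
    "harness_winogrande_5",
    split="train",
)
```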
[ "# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat-v4-4k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat-v4-4k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T14:57:16.258420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat-v4-4k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat-v4-4k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T14:57:16.258420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat-v4-4k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat-v4-4k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T14:57:16.258420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c13f99d8b59118f63f9f67876e9e67b90578315f
# Dataset Documentation

## Overview

This dataset contains 1000 stories spanning 100 different genres. Each story is represented in a tabular format using a dataframe. The dataset includes unique IDs, titles, and the content of each story.

## Genre List

The list of all genres can be found in the [story_genres.pkl](https://huggingface.co/datasets/FareedKhan/1k_stories_100_genre/blob/main/story_genres.pkl) file.

Reading the genre list (the `story_genres` variable):

```python
import pickle

with open('story_genres.pkl', 'rb') as f:
    story_genres = pickle.load(f)
```

Sample of the genre list:

```python
genres = ['Sci-Fi', 'Comedy', ...]
```

## Dataframe Format

The dataset is structured in the following format:

1. **id**: Unique identifier for each story.
2. **title**: Title of the story.
3. **story**: The content of the story.
4. **genre**: The genre of the story.

## Sample Dataframe

| id    | title              | story                        | genre  |
|-------|--------------------|------------------------------|--------|
| 25235 | The Unseen Miracle | It was a stormy night in ... | Horror |
| ...   | ...                | ...                          | ...    |

## Average Length of Words

- Title: 6 words
- Story: 960 words

# License

This dataset is licensed under the [cc-by-2.0](https://creativecommons.org/licenses/by/2.0/deed.en) license.
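For context, a minimal usage sketch. It assumes the dataset can be loaded directly with the `datasets` library and that the default split is named `train` (both are assumptions, not stated above); the column names follow the dataframe format documented above.

```python
from datasets import load_dataset

# Load the stories from the Hub and convert to a pandas dataframe
ds = load_dataset("FareedKhan/1k_stories_100_genre", split="train")
df = ds.to_pandas()

# Inspect the documented columns: id, title, story, genre
print(df[["id", "title", "genre"]].head())

# Count stories per genre (1000 stories across 100 genres)
print(df["genre"].value_counts().head())
```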
FareedKhan/1k_stories_100_genre
[ "task_categories:summarization", "task_categories:text-generation", "task_categories:text-classification", "size_categories:1K<n<10K", "language:en", "license:cc-by-2.0", "data science", "Storytelling", "Genre Classification", "NLP", "LLM", "Deep Learning", "region:us" ]
2023-12-08T15:09:24+00:00
{"language": ["en"], "license": "cc-by-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["summarization", "text-generation", "text-classification"], "pretty_name": "Thousand Stories, Hundred Genres", "tags": ["data science", "Storytelling", "Genre Classification", "NLP", "LLM", "Deep Learning"]}
2023-12-08T17:48:44+00:00
[]
[ "en" ]
TAGS #task_categories-summarization #task_categories-text-generation #task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-2.0 #data science #Storytelling #Genre Classification #NLP #LLM #Deep Learning #region-us
Dataset Documentation
=====================

Overview
--------

This dataset contains 1000 stories spanning 100 different genres. Each story is represented in a tabular format using a dataframe. The dataset includes unique IDs, titles, and the content of each story.

Genre List
----------

The list of all genres can be found in the URL file.

Reading the genre list variable:

Sample of the genre list:

Dataframe Format
----------------

The dataset is structured in the following format:

1. id: Unique identifier for each story.
2. title: Title of the story.
3. story: The content of the story.
4. genre: The genre of the story.

Sample Dataframe
----------------

Average Length of Words
-----------------------

* Title: 6 words
* Story: 960 words

License
=======

This dataset is licensed under the cc-by-2.0 license.
[]
[ "TAGS\n#task_categories-summarization #task_categories-text-generation #task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-2.0 #data science #Storytelling #Genre Classification #NLP #LLM #Deep Learning #region-us \n" ]
[ 85 ]
[ "passage: TAGS\n#task_categories-summarization #task_categories-text-generation #task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-2.0 #data science #Storytelling #Genre Classification #NLP #LLM #Deep Learning #region-us \n" ]
6fb6047f47a07660aa7d79628d60afefd9411aa1
# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-base ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/deepseek-ai/deepseek-llm-67b-base - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-llm-67b-base](https://huggingface.co/deepseek-ai/deepseek-llm-67b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T15:08:33.397139](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base/blob/main/results_2023-12-08T15-08-33.397139.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7152016597064887, "acc_stderr": 0.029610855644222223, "acc_norm": 0.7193168663591899, "acc_norm_stderr": 0.030182859586413653, "mc1": 0.3525091799265606, "mc1_stderr": 0.016724646380756544, "mc2": 0.5108013665291756, "mc2_stderr": 0.014538753767819627 }, "harness|arc:challenge|25": { "acc": 0.6262798634812287, "acc_stderr": 0.014137708601759096, "acc_norm": 0.6544368600682594, "acc_norm_stderr": 0.01389693846114568 }, "harness|hellaswag|10": { "acc": 0.6783509261103365, "acc_stderr": 0.004661544991583034, "acc_norm": 0.8710416251742681, "acc_norm_stderr": 0.003344689038650325 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.040943762699967926, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.040943762699967926 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8092105263157895, "acc_stderr": 0.03197565821032499, "acc_norm": 0.8092105263157895, "acc_norm_stderr": 0.03197565821032499 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7811320754716982, "acc_stderr": 0.025447863825108614, "acc_norm": 0.7811320754716982, "acc_norm_stderr": 0.025447863825108614 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8472222222222222, "acc_stderr": 0.030085743248565677, "acc_norm": 0.8472222222222222, "acc_norm_stderr": 0.030085743248565677 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, 
"acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7341040462427746, "acc_stderr": 0.03368762932259431, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.03368762932259431 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7319148936170212, "acc_stderr": 0.028957342788342343, "acc_norm": 0.7319148936170212, "acc_norm_stderr": 0.028957342788342343 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6827586206896552, "acc_stderr": 0.03878352372138622, "acc_norm": 0.6827586206896552, "acc_norm_stderr": 0.03878352372138622 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.5026455026455027, "acc_stderr": 0.025750949678130387, "acc_norm": 0.5026455026455027, "acc_norm_stderr": 0.025750949678130387 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5317460317460317, "acc_stderr": 0.04463112720677173, "acc_norm": 0.5317460317460317, "acc_norm_stderr": 0.04463112720677173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8258064516129032, "acc_stderr": 0.021576248184514587, "acc_norm": 0.8258064516129032, "acc_norm_stderr": 0.021576248184514587 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5714285714285714, "acc_stderr": 0.034819048444388045, "acc_norm": 0.5714285714285714, "acc_norm_stderr": 0.034819048444388045 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.74, "acc_stderr": 0.044084400227680794, "acc_norm": 0.74, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8181818181818182, "acc_stderr": 0.030117688929503582, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.030117688929503582 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.898989898989899, "acc_stderr": 0.02146973557605533, "acc_norm": 0.898989898989899, "acc_norm_stderr": 0.02146973557605533 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9844559585492227, "acc_stderr": 0.008927492715084313, "acc_norm": 0.9844559585492227, "acc_norm_stderr": 0.008927492715084313 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7307692307692307, "acc_stderr": 0.02248938979365483, "acc_norm": 0.7307692307692307, "acc_norm_stderr": 0.02248938979365483 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37777777777777777, "acc_stderr": 0.029560707392465715, "acc_norm": 0.37777777777777777, "acc_norm_stderr": 0.029560707392465715 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.8235294117647058, "acc_stderr": 0.024762902678057922, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.024762902678057922 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4370860927152318, "acc_stderr": 0.04050035722230636, "acc_norm": 0.4370860927152318, "acc_norm_stderr": 0.04050035722230636 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9045871559633027, "acc_stderr": 0.012595899282335812, "acc_norm": 0.9045871559633027, "acc_norm_stderr": 0.012595899282335812 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6435185185185185, "acc_stderr": 0.032664783315272714, "acc_norm": 0.6435185185185185, "acc_norm_stderr": 0.032664783315272714 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813905, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813905 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9156118143459916, "acc_stderr": 0.01809424711647333, "acc_norm": 0.9156118143459916, "acc_norm_stderr": 0.01809424711647333 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7937219730941704, "acc_stderr": 0.02715715047956382, "acc_norm": 0.7937219730941704, "acc_norm_stderr": 0.02715715047956382 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8244274809160306, "acc_stderr": 0.03336820338476074, "acc_norm": 0.8244274809160306, "acc_norm_stderr": 0.03336820338476074 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8512396694214877, "acc_stderr": 0.03248470083807193, "acc_norm": 0.8512396694214877, "acc_norm_stderr": 0.03248470083807193 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.03602814176392645, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.03602814176392645 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8343558282208589, "acc_stderr": 0.029208296231259104, "acc_norm": 0.8343558282208589, "acc_norm_stderr": 0.029208296231259104 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.9029126213592233, "acc_stderr": 0.02931596291881348, "acc_norm": 0.9029126213592233, "acc_norm_stderr": 0.02931596291881348 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9316239316239316, "acc_stderr": 0.016534627684311368, "acc_norm": 0.9316239316239316, "acc_norm_stderr": 0.016534627684311368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9003831417624522, "acc_stderr": 0.010709685591251671, "acc_norm": 0.9003831417624522, "acc_norm_stderr": 0.010709685591251671 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7630057803468208, "acc_stderr": 0.02289408248992599, "acc_norm": 0.7630057803468208, "acc_norm_stderr": 0.02289408248992599 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4491620111731844, "acc_stderr": 0.01663583834163192, "acc_norm": 0.4491620111731844, "acc_norm_stderr": 0.01663583834163192 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7875816993464052, "acc_stderr": 0.02342037547829613, "acc_norm": 0.7875816993464052, "acc_norm_stderr": 0.02342037547829613 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8135048231511254, "acc_stderr": 0.02212243977248077, "acc_norm": 0.8135048231511254, "acc_norm_stderr": 0.02212243977248077 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.8549382716049383, "acc_stderr": 0.01959487701972795, "acc_norm": 0.8549382716049383, "acc_norm_stderr": 0.01959487701972795 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5567375886524822, "acc_stderr": 0.029634838473766006, "acc_norm": 0.5567375886524822, "acc_norm_stderr": 0.029634838473766006 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.560625814863103, "acc_stderr": 0.012676014778580219, "acc_norm": 0.560625814863103, "acc_norm_stderr": 0.012676014778580219 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7757352941176471, "acc_stderr": 0.025336848563332372, "acc_norm": 0.7757352941176471, "acc_norm_stderr": 0.025336848563332372 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8022875816993464, "acc_stderr": 0.016112443369726736, "acc_norm": 0.8022875816993464, "acc_norm_stderr": 0.016112443369726736 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514279, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514279 }, "harness|hendrycksTest-sociology|5": { "acc": 0.900497512437811, "acc_stderr": 0.0211662163046594, "acc_norm": 0.900497512437811, "acc_norm_stderr": 0.0211662163046594 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.94, "acc_stderr": 0.02386832565759419, "acc_norm": 0.94, "acc_norm_stderr": 0.02386832565759419 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.038695433234721015, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.038695433234721015 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8596491228070176, "acc_stderr": 0.0266405825391332, "acc_norm": 0.8596491228070176, "acc_norm_stderr": 0.0266405825391332 }, "harness|truthfulqa:mc|0": { "mc1": 0.3525091799265606, "mc1_stderr": 0.016724646380756544, "mc2": 0.5108013665291756, "mc2_stderr": 0.014538753767819627 }, "harness|winogrande|5": { "acc": 0.8413575374901342, "acc_stderr": 0.01026793624302822 }, "harness|gsm8k|5": { "acc": 0.5670962850644428, "acc_stderr": 0.013647916362576052 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base
[ "region:us" ]
2023-12-08T15:11:12+00:00
{"pretty_name": "Evaluation run of deepseek-ai/deepseek-llm-67b-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-llm-67b-base](https://huggingface.co/deepseek-ai/deepseek-llm-67b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T15:08:33.397139](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base/blob/main/results_2023-12-08T15-08-33.397139.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7152016597064887,\n \"acc_stderr\": 0.029610855644222223,\n \"acc_norm\": 0.7193168663591899,\n \"acc_norm_stderr\": 0.030182859586413653,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5108013665291756,\n \"mc2_stderr\": 0.014538753767819627\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759096,\n \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.01389693846114568\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6783509261103365,\n \"acc_stderr\": 0.004661544991583034,\n \"acc_norm\": 0.8710416251742681,\n \"acc_norm_stderr\": 0.003344689038650325\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.03197565821032499,\n \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.03197565821032499\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.025447863825108614,\n \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.025447863825108614\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.030085743248565677,\n \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.030085743248565677\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7319148936170212,\n \"acc_stderr\": 0.028957342788342343,\n \"acc_norm\": 0.7319148936170212,\n \"acc_norm_stderr\": 0.028957342788342343\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.03878352372138622,\n \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.03878352372138622\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5026455026455027,\n \"acc_stderr\": 0.025750949678130387,\n \"acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.025750949678130387\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.034819048444388045,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.034819048444388045\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503582,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503582\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.898989898989899,\n \"acc_stderr\": 0.02146973557605533,\n \"acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.02146973557605533\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084313,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084313\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7307692307692307,\n \"acc_stderr\": 0.02248938979365483,\n \"acc_norm\": 0.7307692307692307,\n \"acc_norm_stderr\": 0.02248938979365483\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465715,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465715\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.024762902678057922,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.024762902678057922\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9045871559633027,\n \"acc_stderr\": 0.012595899282335812,\n \"acc_norm\": 0.9045871559633027,\n \"acc_norm_stderr\": 0.012595899282335812\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647333,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647333\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807193,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807193\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311368,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n 
\"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n \"acc_norm_stderr\": 0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.8135048231511254,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8549382716049383,\n \"acc_stderr\": 0.01959487701972795,\n \"acc_norm\": 0.8549382716049383,\n \"acc_norm_stderr\": 0.01959487701972795\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.560625814863103,\n \"acc_stderr\": 0.012676014778580219,\n \"acc_norm\": 0.560625814863103,\n \"acc_norm_stderr\": 0.012676014778580219\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7757352941176471,\n \"acc_stderr\": 0.025336848563332372,\n \"acc_norm\": 0.7757352941176471,\n \"acc_norm_stderr\": 0.025336848563332372\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8022875816993464,\n \"acc_stderr\": 0.016112443369726736,\n \"acc_norm\": 0.8022875816993464,\n \"acc_norm_stderr\": 0.016112443369726736\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514279,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514279\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.0211662163046594,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.0211662163046594\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759419,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759419\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5108013665291756,\n \"mc2_stderr\": 0.014538753767819627\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.01026793624302822\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5670962850644428,\n \"acc_stderr\": 0.013647916362576052\n }\n}\n```", "repo_url": 
"https://huggingface.co/deepseek-ai/deepseek-llm-67b-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|arc:challenge|25_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|gsm8k|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hellaswag|10_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["**/details_harness|winogrande|5_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T15-08-33.397139.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T15_08_33.397139", "path": ["results_2023-12-08T15-08-33.397139.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T15-08-33.397139.parquet"]}]}]}
2023-12-08T15:11:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-base ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-base on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-08T15:08:33.397139 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-base", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T15:08:33.397139(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-base", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T15:08:33.397139(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 175, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-base## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T15:08:33.397139(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8ea48d36e7fd155301800916b6af265bf1860113
# Sampled Trelis/big_patent_sample Dataset This is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions shorter than or equal to 60,000 characters in length.
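As a rough illustration of the length filter described above (not the actual preprocessing script; the source split name and the `description` column are assumptions), such a subset could be built with:

```python
from datasets import load_dataset

# Illustrative sketch only: the split name and "description" column are assumptions.
source = load_dataset("Trelis/big_patent_sample", split="train")
subset = source.filter(lambda row: len(row["description"]) <= 60_000)
print(f"{len(subset)} rows with descriptions of at most 60,000 characters")
```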
Trelis/big_patent_60k_characters
[ "region:us" ]
2023-12-08T15:20:39+00:00
{}
2023-12-08T15:22:01+00:00
[]
[]
TAGS #region-us
# Sampled Trelis/big_patent_sample Dataset This is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions shorter than or equal to 60,000 characters in length.
[ "# Sampled Trelis/big_patent_sample Dataset\nThis is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions shorter than or equal to 60,000 characters in length." ]
[ "TAGS\n#region-us \n", "# Sampled Trelis/big_patent_sample Dataset\nThis is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions shorter than or equal to 60,000 characters in length." ]
[ 6, 55 ]
[ "passage: TAGS\n#region-us \n# Sampled Trelis/big_patent_sample Dataset\nThis is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions shorter than or equal to 60,000 characters in length." ]
a45565b586e83ff3e632461814b4f4fce523d334
# Dataset Card for "multilingual-TEDX-fr-duration" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mattlc/multilingual-TEDX-fr-duration
[ "region:us" ]
2023-12-08T15:40:02+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "file", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "sentence", "dtype": "string"}, {"name": "speaker_id", "dtype": "string"}, {"name": "start_timestamp", "dtype": "float32"}, {"name": "end_timestamp", "dtype": "float32"}, {"name": "index", "dtype": "int32"}, {"name": "duration", "dtype": "float64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20290217368.375, "num_examples": 116045}, {"name": "test", "num_bytes": 179302302.625, "num_examples": 1059}, {"name": "validation", "num_bytes": 179302302.625, "num_examples": 1059}], "download_size": 20376737131, "dataset_size": 20648821973.625}}
2023-12-08T15:51:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for "multilingual-TEDX-fr-duration" More Information needed
[ "# Dataset Card for \"multilingual-TEDX-fr-duration\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"multilingual-TEDX-fr-duration\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"multilingual-TEDX-fr-duration\"\n\nMore Information needed" ]
17ff6075a225829ee8f98856ec900b2ab474002c
# Dataset Card for Evaluation run of ehartford/dolphin-2.2-70b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/ehartford/dolphin-2.2-70b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.2-70b](https://huggingface.co/ehartford/dolphin-2.2-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.2-70b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T15:41:29.981879](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-70b/blob/main/results_2023-12-08T15-41-29.981879.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6908692958524526, "acc_stderr": 0.030565557295291628, "acc_norm": 0.6948282125429526, "acc_norm_stderr": 0.031156142381821045, "mc1": 0.42472460220318237, "mc1_stderr": 0.01730400095716748, "mc2": 0.6013577707139347, "mc2_stderr": 0.014840096510342907 }, "harness|arc:challenge|25": { "acc": 0.6569965870307167, "acc_stderr": 0.013872423223718166, "acc_norm": 0.7005119453924915, "acc_norm_stderr": 0.01338502163731357 }, "harness|hellaswag|10": { "acc": 0.668990240987851, "acc_stderr": 0.004696148339570979, "acc_norm": 0.8596893049193388, "acc_norm_stderr": 0.0034659928816107746 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7763157894736842, "acc_stderr": 0.03391160934343603, "acc_norm": 0.7763157894736842, "acc_norm_stderr": 0.03391160934343603 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724057, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7986111111111112, "acc_stderr": 0.033536474697138406, "acc_norm": 0.7986111111111112, "acc_norm_stderr": 0.033536474697138406 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 
0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6723404255319149, "acc_stderr": 0.03068302084323101, "acc_norm": 0.6723404255319149, "acc_norm_stderr": 0.03068302084323101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.04685473041907789, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.04685473041907789 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6, "acc_stderr": 0.040824829046386284, "acc_norm": 0.6, "acc_norm_stderr": 0.040824829046386284 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4497354497354497, "acc_stderr": 0.02562085704293665, "acc_norm": 0.4497354497354497, "acc_norm_stderr": 0.02562085704293665 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8419354838709677, "acc_stderr": 0.020752831511875278, "acc_norm": 0.8419354838709677, "acc_norm_stderr": 0.020752831511875278 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5467980295566502, "acc_stderr": 0.03502544650845872, "acc_norm": 0.5467980295566502, "acc_norm_stderr": 0.03502544650845872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8242424242424242, "acc_stderr": 0.02972094300622445, "acc_norm": 0.8242424242424242, "acc_norm_stderr": 0.02972094300622445 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8787878787878788, "acc_stderr": 0.023253157951942084, "acc_norm": 0.8787878787878788, "acc_norm_stderr": 0.023253157951942084 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9222797927461139, "acc_stderr": 0.01932180555722313, "acc_norm": 0.9222797927461139, "acc_norm_stderr": 0.01932180555722313 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6974358974358974, "acc_stderr": 0.02329088805377272, "acc_norm": 0.6974358974358974, "acc_norm_stderr": 0.02329088805377272 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114986, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114986 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7605042016806722, "acc_stderr": 0.02772206549336126, "acc_norm": 0.7605042016806722, "acc_norm_stderr": 
0.02772206549336126 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4370860927152318, "acc_stderr": 0.04050035722230636, "acc_norm": 0.4370860927152318, "acc_norm_stderr": 0.04050035722230636 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8862385321100917, "acc_stderr": 0.013613614800232808, "acc_norm": 0.8862385321100917, "acc_norm_stderr": 0.013613614800232808 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 0.033723432716530624, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.033723432716530624 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813905, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813905 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8818565400843882, "acc_stderr": 0.021011052659878467, "acc_norm": 0.8818565400843882, "acc_norm_stderr": 0.021011052659878467 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7713004484304933, "acc_stderr": 0.028188240046929196, "acc_norm": 0.7713004484304933, "acc_norm_stderr": 0.028188240046929196 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8625954198473282, "acc_stderr": 0.03019482399680448, "acc_norm": 0.8625954198473282, "acc_norm_stderr": 0.03019482399680448 }, "harness|hendrycksTest-international_law|5": { "acc": 0.859504132231405, "acc_stderr": 0.03172233426002158, "acc_norm": 0.859504132231405, "acc_norm_stderr": 0.03172233426002158 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8282208588957055, "acc_stderr": 0.029634717272371033, "acc_norm": 0.8282208588957055, "acc_norm_stderr": 0.029634717272371033 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.03492606476623792, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.03492606476623792 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8492975734355045, "acc_stderr": 0.01279342088312082, "acc_norm": 0.8492975734355045, "acc_norm_stderr": 0.01279342088312082 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8034682080924855, "acc_stderr": 0.021393961404363847, "acc_norm": 0.8034682080924855, "acc_norm_stderr": 0.021393961404363847 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.587709497206704, "acc_stderr": 0.016463200238114515, "acc_norm": 0.587709497206704, "acc_norm_stderr": 0.016463200238114515 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.025360603796242553, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.025360603796242553 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7620578778135049, "acc_stderr": 0.02418515064781871, "acc_norm": 0.7620578778135049, "acc_norm_stderr": 0.02418515064781871 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8148148148148148, "acc_stderr": 
0.0216138093952248, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.0216138093952248 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5460992907801419, "acc_stderr": 0.029700453247291477, "acc_norm": 0.5460992907801419, "acc_norm_stderr": 0.029700453247291477 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5430247718383312, "acc_stderr": 0.012722869501611419, "acc_norm": 0.5430247718383312, "acc_norm_stderr": 0.012722869501611419 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7434640522875817, "acc_stderr": 0.017667841612379005, "acc_norm": 0.7434640522875817, "acc_norm_stderr": 0.017667841612379005 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7636363636363637, "acc_stderr": 0.040693063197213754, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.040693063197213754 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7836734693877551, "acc_stderr": 0.02635891633490402, "acc_norm": 0.7836734693877551, "acc_norm_stderr": 0.02635891633490402 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700637, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700637 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.038786267710023595, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.038786267710023595 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8596491228070176, "acc_stderr": 0.0266405825391332, "acc_norm": 0.8596491228070176, "acc_norm_stderr": 0.0266405825391332 }, "harness|truthfulqa:mc|0": { "mc1": 0.42472460220318237, "mc1_stderr": 0.01730400095716748, "mc2": 0.6013577707139347, "mc2_stderr": 0.014840096510342907 }, "harness|winogrande|5": { "acc": 0.8145224940805051, "acc_stderr": 0.010923965303140505 }, "harness|gsm8k|5": { "acc": 0.5678544351781653, "acc_stderr": 0.013645072137842443 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_ehartford__dolphin-2.2-70b
[ "region:us" ]
2023-12-08T15:44:30+00:00
{"pretty_name": "Evaluation run of ehartford/dolphin-2.2-70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.2-70b](https://huggingface.co/ehartford/dolphin-2.2-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.2-70b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T15:41:29.981879](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-70b/blob/main/results_2023-12-08T15-41-29.981879.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6908692958524526,\n \"acc_stderr\": 0.030565557295291628,\n \"acc_norm\": 0.6948282125429526,\n \"acc_norm_stderr\": 0.031156142381821045,\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.6013577707139347,\n \"mc2_stderr\": 0.014840096510342907\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6569965870307167,\n \"acc_stderr\": 0.013872423223718166,\n \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.01338502163731357\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.668990240987851,\n \"acc_stderr\": 0.004696148339570979,\n \"acc_norm\": 0.8596893049193388,\n \"acc_norm_stderr\": 0.0034659928816107746\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.03068302084323101,\n \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.03068302084323101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8419354838709677,\n \"acc_stderr\": 0.020752831511875278,\n \"acc_norm\": 0.8419354838709677,\n \"acc_norm_stderr\": 0.020752831511875278\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.01932180555722313,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.01932180555722313\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377272,\n \"acc_norm\": 
0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377272\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.02772206549336126,\n \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.02772206549336126\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8862385321100917,\n \"acc_stderr\": 0.013613614800232808,\n \"acc_norm\": 0.8862385321100917,\n \"acc_norm_stderr\": 0.013613614800232808\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878467,\n \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878467\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.7713004484304933,\n \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.03019482399680448,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.03019482399680448\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371033,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371033\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623792,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623792\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8492975734355045,\n \"acc_stderr\": 0.01279342088312082,\n \"acc_norm\": 0.8492975734355045,\n \"acc_norm_stderr\": 0.01279342088312082\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.021393961404363847,\n \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.021393961404363847\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.587709497206704,\n \"acc_stderr\": 0.016463200238114515,\n \"acc_norm\": 0.587709497206704,\n \"acc_norm_stderr\": 0.016463200238114515\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.0216138093952248,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.0216138093952248\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291477,\n \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291477\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7434640522875817,\n \"acc_stderr\": 0.017667841612379005,\n \"acc_norm\": 0.7434640522875817,\n \"acc_norm_stderr\": 0.017667841612379005\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.040693063197213754,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.040693063197213754\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.02635891633490402,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.02635891633490402\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.6013577707139347,\n \"mc2_stderr\": 0.014840096510342907\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5678544351781653,\n \"acc_stderr\": 0.013645072137842443\n }\n}\n```", "repo_url": "https://huggingface.co/ehartford/dolphin-2.2-70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|arc:challenge|25_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|gsm8k|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hellaswag|10_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["**/details_harness|winogrande|5_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T15-41-29.981879.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T15_41_29.981879", "path": ["results_2023-12-08T15-41-29.981879.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T15-41-29.981879.parquet"]}]}]}
2023-12-08T15:45:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ehartford/dolphin-2.2-70b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model ehartford/dolphin-2.2-70b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-08T15:41:29.981879 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
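For the "To load the details from a run" step in the card above, here is a minimal sketch of the usual loading call. The repository name assumes the leaderboard's `details_<org>__<model>` naming convention, and the config and split names are taken from the config list in this record's metadata; treat both as assumptions rather than confirmed values.

```python
from datasets import load_dataset

# Load one evaluation config for ehartford/dolphin-2.2-70b.
# "harness_winogrande_5" is one of the configs listed in this record's metadata;
# the "latest" split always points to the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_ehartford__dolphin-2.2-70b",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```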
[ "# Dataset Card for Evaluation run of ehartford/dolphin-2.2-70b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehartford/dolphin-2.2-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T15:41:29.981879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ehartford/dolphin-2.2-70b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehartford/dolphin-2.2-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T15:41:29.981879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ehartford/dolphin-2.2-70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehartford/dolphin-2.2-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T15:41:29.981879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
74576766cf32f2fbb76b0bc91a0aef3e1c421de2
The dataset is used for classifying portrait images in anime videos and is specifically divided into the following four categories: * **Vision**: Overall character images (may include instances of facial drawing issues; not recommended to split when training LoRA). * **Imagery**: Clear character images (both facial and upper-body features are clear). * **Halfbody**: Upper-body character images (mostly dominated by the head, with some partial upper-body features). * **Face**: Close-up shots of character faces (close-up shots of faces with highly clear features in appearance and expression). | Dataset | Vision | Imagery | Halfbody | Face | |---------|--------|---------|----------|-------| | v0 | 46866 | 49671 | 49685 | 49438 |
deepghs/bangumi_char_type
[ "task_categories:image-classification", "size_categories:100K<n<1M", "license:openrail", "art", "region:us" ]
2023-12-08T16:17:48+00:00
{"license": "openrail", "size_categories": ["100K<n<1M"], "task_categories": ["image-classification"], "tags": ["art"]}
2023-12-09T06:21:38+00:00
[]
[]
TAGS #task_categories-image-classification #size_categories-100K<n<1M #license-openrail #art #region-us
The dataset is used for classifying portrait images in anime videos and is specifically divided into the following four categories: * Vision: Overall character images (may include instances of facial drawing issues; not recommended to split when training LoRA). * Imagery: Clear character images (both facial and upper-body features are clear). * Halfbody: Upper-body character images (mostly dominated by the head, with some partial upper-body features). * Face: Close-up shots of character faces (close-up shots of faces with highly clear features in appearance and expression).
[]
[ "TAGS\n#task_categories-image-classification #size_categories-100K<n<1M #license-openrail #art #region-us \n" ]
[ 37 ]
[ "passage: TAGS\n#task_categories-image-classification #size_categories-100K<n<1M #license-openrail #art #region-us \n" ]
70ac9a7aa43ff254268c8e8e4c9d8252b668ffc9
<p align="center"><h1>🧠 Awesome Prompts [CSV dataset]</h1></p> Based on [awesome-prompt](https://huggingface.co/datasets/fka/awesome-chatgpt-prompts) by [fka](https://huggingface.co/fka).
njfamirm/prompt
[ "license:mit", "prompt", "region:us" ]
2023-12-08T16:41:28+00:00
{"license": "mit", "tags": ["prompt"]}
2023-12-08T19:28:46+00:00
[]
[]
TAGS #license-mit #prompt #region-us
<p align="center"><h1> Awesome Prompts [CSV dataset]</h1></p> Based on awesome-prompt by fka.
[]
[ "TAGS\n#license-mit #prompt #region-us \n" ]
[ 15 ]
[ "passage: TAGS\n#license-mit #prompt #region-us \n" ]
c544e1d412ee2915a671b6ad53c276ac1d30e4d7
# AlpacaCode This is a version of [Alpaca Code](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style) formatted for instruction fine-tuning using the following prompt template: ``` ### Instruction: Instruction ### Input: Input ### Response: ```
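As an illustrative helper (not part of the dataset itself), the following sketch renders an instruction/input/response triple into the template shown above; the parameter names are hypothetical and chosen only for the example.

```python
def format_alpaca_prompt(instruction: str, input_text: str = "", response: str = "") -> str:
    """Render one example in the Alpaca-style template used by this dataset."""
    prompt = f"### Instruction:\n{instruction}\n\n"
    if input_text:
        prompt += f"### Input:\n{input_text}\n\n"
    prompt += f"### Response:\n{response}"
    return prompt


# Example usage with a made-up record:
print(format_alpaca_prompt(
    instruction="Write a function that reverses a string.",
    response="def reverse(s):\n    return s[::-1]",
))
```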
mwitiderrick/AlpacaCode
[ "task_categories:question-answering", "task_categories:text-generation", "size_categories:100K<n<1M", "language:en", "license:mit", "region:us" ]
2023-12-08T17:16:14+00:00
{"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["question-answering", "text-generation"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4817562, "num_examples": 1073}], "download_size": 1633970, "dataset_size": 4817562}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-09T18:26:18+00:00
[]
[ "en" ]
TAGS #task_categories-question-answering #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-mit #region-us
# AlpcaCode This is a version of Alpaca Code formatted for instruction fine-tuning using the following prompt template:
[ "# AlpcaCode\nThis is a version of Alpaca Code formatted for instruction fine-tuning using the following \nprompt template:" ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-mit #region-us \n", "# AlpcaCode\nThis is a version of Alpaca Code formatted for instruction fine-tuning using the following \nprompt template:" ]
[ 50, 27 ]
[ "passage: TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-mit #region-us \n# AlpcaCode\nThis is a version of Alpaca Code formatted for instruction fine-tuning using the following \nprompt template:" ]
bf950f46bbd0f4ce6f9e445e09edf938d1cdc1ff
## Dataset Summary The original dataset is from: [PhoST](https://github.com/VinAIResearch/PhoST) A speech-to-text translation dataset with source audio in English and target sentences in Vietnamese. In this STVi dataset, we split the original audio into 1-15s segments corresponding to each sentence, and applied spelling correction, word normalization, and audio segmentation. ## Data Structure ``` {'path': 'stvi/train/waves/315814.wav', 'audio': {'path': 'stvi/train/waves/315814.wav', 'array': array([-0.0050354 , -0.00296021, -0.00549316, ..., -0.0085144 , -0.00686646, -0.00018311]), 'sampling_rate': 16000}, 'sentence': 'tôi cảm thấy bẽ mặt và xấu hổ đến mức tôi chạy ngay về nhà với mẹ và trách phạt bà vì để tôi mặc cái áo gớm guốc'} ``` ### Data Fields - path: The path to the audio file - audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. - sentence: The transcript sentence ### Data Splits The speech material has been subdivided into portions for train and test. The speech was split from a TED Talk, and each speech segment corresponds to a transcript sentence. | | Train | Test | | ------------------- | ----- | ----- | | Utterances | 294723| 1054 | | Duration (in hours) | 425.91| 1.616 |
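As an illustrative sketch of reading the fields described above with the `datasets` library (assuming the repository loads directly via `load_dataset`; the test split is small enough, about 1.6 hours, that streaming is not needed):

```python
from datasets import load_dataset

# Load only the small test split; the train split is ~426 hours of audio.
ds = load_dataset("TruongScotl/stvi", split="test")

sample = ds[0]
print(sample["path"])                    # path to the source audio file
print(sample["sentence"])                # Vietnamese target sentence
print(sample["audio"]["sampling_rate"])  # 16000
print(sample["audio"]["array"].shape)    # decoded waveform as a NumPy array
```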
TruongScotl/stvi
[ "task_categories:translation", "size_categories:100K<n<1M", "source_datasets:PhoST - VinAI", "language:vi", "license:cc-by-nc-4.0", "region:us" ]
2023-12-08T17:56:04+00:00
{"language": ["vi"], "license": "cc-by-nc-4.0", "size_categories": ["100K<n<1M"], "source_datasets": ["PhoST - VinAI"], "task_categories": ["translation"], "pretty_name": "Speech to Text Translation Vietnamese", "dataset_info": {"features": [{"name": "path", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_examples": 294723}, {"name": "test", "num_examples": 1095}]}}
2024-01-09T07:27:46+00:00
[]
[ "vi" ]
TAGS #task_categories-translation #size_categories-100K<n<1M #source_datasets-PhoST - VinAI #language-Vietnamese #license-cc-by-nc-4.0 #region-us
Dataset Summary --------------- The origin dataset from: PhoST A speech-to-text translation dataset with source audio in English and target sentences in Vietnamese. In this STVi dataset, we split the origin audio into 1-15s corresponding to each sentence and spelling correction, normalization word, audio segmentation Data Structure -------------- ### Data Fields * path: The path to the audio file * audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. * sentence: The transcript sentence ### Data Splits The speech material has been subdivided into portions for train and test. The speech was split from a TED Talk, and each speech segment corresponds to a transcript sentence. Train: Utterances, Test: 294723 Train: Duration (in hours), Test: 425.91
[ "### Data Fields\n\n\n* path: The path to the audio file\n* audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate.\n* sentence: The transcript sentence", "### Data Splits\n\n\nThe speech material has been subdivided into portions for train and test.\n\n\nThe speech was split from a TED Talk, and each speech segment corresponds to a transcript sentence.\n\n\nTrain: Utterances, Test: 294723\nTrain: Duration (in hours), Test: 425.91" ]
[ "TAGS\n#task_categories-translation #size_categories-100K<n<1M #source_datasets-PhoST - VinAI #language-Vietnamese #license-cc-by-nc-4.0 #region-us \n", "### Data Fields\n\n\n* path: The path to the audio file\n* audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate.\n* sentence: The transcript sentence", "### Data Splits\n\n\nThe speech material has been subdivided into portions for train and test.\n\n\nThe speech was split from a TED Talk, and each speech segment corresponds to a transcript sentence.\n\n\nTrain: Utterances, Test: 294723\nTrain: Duration (in hours), Test: 425.91" ]
[ 58, 54, 67 ]
[ "passage: TAGS\n#task_categories-translation #size_categories-100K<n<1M #source_datasets-PhoST - VinAI #language-Vietnamese #license-cc-by-nc-4.0 #region-us \n### Data Fields\n\n\n* path: The path to the audio file\n* audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate.\n* sentence: The transcript sentence### Data Splits\n\n\nThe speech material has been subdivided into portions for train and test.\n\n\nThe speech was split from a TED Talk, and each speech segment corresponds to a transcript sentence.\n\n\nTrain: Utterances, Test: 294723\nTrain: Duration (in hours), Test: 425.91" ]
cb83312f92112dd895853b6e207a07fe6af06370
# Dataset Card for "simple_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
xwjiang2010/simple_dataset
[ "region:us" ]
2023-12-08T18:03:35+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14, "num_examples": 2}], "download_size": 760, "dataset_size": 14}}
2023-12-08T18:27:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for "simple_dataset" More Information needed
[ "# Dataset Card for \"simple_dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"simple_dataset\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"simple_dataset\"\n\nMore Information needed" ]
fab89ce74bf53152e1e1c0d7758d57ae5d8fe78b
This is a dataset for fine-tuning models on function calling based on [glaiveai/glaive-function-calling-v2](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2). The dataset includes 86,864 examples of chats that include function calling as part of the conversation. The system prompt includes either 0, 1, or 2 functions that the assistant can use, and instructions on how the assistant can use them. Changes include: * Using ShareGPT format for chats * Adding "function_response" as a role * Removing code examples * Removing examples with invalid JSON as function calls / responses * Updating system message to include instructions on how to do function calls
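To make the ShareGPT layout concrete, here is a small sketch of iterating over the turns of one record. The `conversations` column name and the `from`/`value` keys are assumptions based on the usual ShareGPT convention and may differ in this repository.

```python
from datasets import load_dataset

ds = load_dataset("hypervariance/function-calling-sharegpt", split="train")

example = ds[0]
# ShareGPT-style rows usually hold a list of turns, each turn being
# {"from": <role>, "value": <text>}; this dataset adds the "function_response"
# role for tool outputs returned to the assistant.
for turn in example["conversations"]:
    print(f'{turn["from"]}: {turn["value"][:80]}')
```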
hypervariance/function-calling-sharegpt
[ "task_categories:text-generation", "size_categories:10K<n<100K", "language:en", "license:apache-2.0", "function-calling", "function-call", "functions", "region:us" ]
2023-12-08T18:16:58+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "tags": ["function-calling", "function-call", "functions"]}
2023-12-08T18:25:19+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #function-calling #function-call #functions #region-us
This is a dataset for finetuning models on function calling based on glaiveai/glaive-function-calling-v2. The dataset includes 86,864 examples of chats that include function calling as part of the conversation. The system prompt includes either 0, 1, or 2 functions that the assistant can use, and instructions on how the agent can use it. Changes include: * Using ShareGPT format for chats * Adding "function_response" as a role * Removing code examples * Removing examples with invalid JSON as function calls / responses * Updating system message to include instructions on how to do function calls
[]
[ "TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #function-calling #function-call #functions #region-us \n" ]
[ 53 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #function-calling #function-call #functions #region-us \n" ]
555bad7968768b5e2b13b691c13f1e854452f312
# Dataset Card for "french_multicorpus_tft_v040" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mattlc/french_multicorpus_tft_v040
[ "region:us" ]
2023-12-08T18:48:53+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "audio_id", "dtype": "string"}, {"name": "language", "dtype": {"class_label": {"names": {"0": "en", "1": "de", "2": "fr", "3": "es", "4": "pl", "5": "it", "6": "ro", "7": "hu", "8": "cs", "9": "nl", "10": "fi", "11": "hr", "12": "sk", "13": "sl", "14": "et", "15": "lt", "16": "en_accented"}}}}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "raw_text", "dtype": "string"}, {"name": "normalized_text", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "speaker_id", "dtype": "string"}, {"name": "is_gold_transcript", "dtype": "bool"}, {"name": "accent", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "duration", "dtype": "float64"}, {"name": "dataset", "dtype": "string"}, {"name": "sentence", "dtype": "string"}, {"name": "file", "dtype": "string"}, {"name": "start_timestamp", "dtype": "float32"}, {"name": "end_timestamp", "dtype": "float32"}, {"name": "index", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 4943753305.625, "num_examples": 18475}, {"name": "test", "num_bytes": 644740130.762, "num_examples": 2613}], "download_size": 5536852622, "dataset_size": 5588493436.387}}
2023-12-10T15:53:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for "french_multicorpus_tft_v040" More Information needed
[ "# Dataset Card for \"french_multicorpus_tft_v040\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"french_multicorpus_tft_v040\"\n\nMore Information needed" ]
[ 6, 24 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"french_multicorpus_tft_v040\"\n\nMore Information needed" ]
cb8339ddee1ea74be11f9f1419e96a2975402384
Check https://github.com/JeffersonQin/DungeonAssistant for details.
gyrojeff/DungeonAssistant
[ "license:mit", "region:us" ]
2023-12-08T19:56:04+00:00
{"license": "mit"}
2023-12-17T03:47:49+00:00
[]
[]
TAGS #license-mit #region-us
Check URL for details.
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
fccfa53893dc28af7fbbfeb14387cdcdde4ff28f
ALERT: this dataset also contains all of the CLOVER dataset Here is the distribution of hints across the 343 files in this dataset: ![image/png](https://cdn-uploads.huggingface.co/production/uploads/633ca1df6abebdd9dab67177/xi3f_unD0tfbQvqd5GvF2.png) Here is the distribution of non-comment code lines across the 343 files in this dataset: ![image/png](https://cdn-uploads.huggingface.co/production/uploads/633ca1df6abebdd9dab67177/4x29-CWz_vw1Xn9CKy3AG.png)
metareflection/dafny_with_hints
[ "region:us" ]
2023-12-08T20:05:19+00:00
{}
2023-12-08T20:33:48+00:00
[]
[]
TAGS #region-us
ALERT: this dataset also contains all of the CLOVER dataset Here is the distribution of hints across the 343 files in this dataset: !image/png Here is the distribution of non-comment code lines across the 343 files in this dataset: !image/png
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
ddce07d36c14bebcea99dd27f0d8633dbf79fb6b
## Description The latest AI movie trailers for the latent space 🔥 ## Model SVD ## LoRA jbilcke-hf/sdxl-cinematic-2 ## Voice Julian ## Music Intense movie trailer cinematic ## Prompt A video channel which produces movie trailers of fictional movies. A typical trailer will be about adventure, action, thriller, science fiction movies etc (so pretty much everything!). The voice-over commentary should imitate the style of movie trailers "this summer.." etc (but you can invent your own trailer commentary style!). Trailers should NEVER mention or be about existing cinema artists or movie studios (if you do this, you will be FIRED!). Instead, invent your own movie names and trailer stories, content, characters etc. Characters should not be biased towards any specific gender, work, country, religion or culture.
jbilcke-hf/ai-tube-trailer-fury
[ "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-08T20:05:25+00:00
{"license": "cc-by-nc-sa-4.0", "pretty_name": "Trailer Fury \ud83c\udf9e\ufe0f"}
2023-12-12T22:37:32+00:00
[]
[]
TAGS #license-cc-by-nc-sa-4.0 #region-us
## Description The latest AI movie trailers for the latent space ## Model SVD ## LoRA jbilcke-hf/sdxl-cinematic-2 ## Voice Julian ## Music Intense movie trailer cinematic ## Prompt A video channel which produces movie trailer of fictional movies. Typical trailer will be about adventure, action, thriller, science fiction movies etc (so pretty much everything!). The voice over commentary should imitate the style of movie trailer "this summer.." etc (but you can invent your own trailer commentary style!). Trailers should NEVER mention or be about existing cinema artists or movie studio (if you do this, you will be FIRED!) Instead invent your own movie names and trailer stories, content, characters etc Characters should not be biased towards any specific gender, work, country, religion or culture.
[ "## Description\n\nThe latest AI movie trailers for the latent space", "## Model\n\nSVD", "## LoRA\n\njbilcke-hf/sdxl-cinematic-2", "## Voice\n\nJulian", "## Music\n\nIntense movie trailer cinematic", "## Prompt\n\nA video channel which produces movie trailer of fictional movies.\nTypical trailer will be about adventure, action, thriller, science fiction movies etc (so pretty much everything!).\nThe voice over commentary should imitate the style of movie trailer \"this summer..\" etc (but you can invent your own trailer commentary style!).\n\nTrailers should NEVER mention or be about existing cinema artists or movie studio (if you do this, you will be FIRED!)\n\nInstead invent your own movie names and trailer stories, content, characters etc\nCharacters should not be biased towards any specific gender, work, country, religion or culture." ]
[ "TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n", "## Description\n\nThe latest AI movie trailers for the latent space", "## Model\n\nSVD", "## LoRA\n\njbilcke-hf/sdxl-cinematic-2", "## Voice\n\nJulian", "## Music\n\nIntense movie trailer cinematic", "## Prompt\n\nA video channel which produces movie trailer of fictional movies.\nTypical trailer will be about adventure, action, thriller, science fiction movies etc (so pretty much everything!).\nThe voice over commentary should imitate the style of movie trailer \"this summer..\" etc (but you can invent your own trailer commentary style!).\n\nTrailers should NEVER mention or be about existing cinema artists or movie studio (if you do this, you will be FIRED!)\n\nInstead invent your own movie names and trailer stories, content, characters etc\nCharacters should not be biased towards any specific gender, work, country, religion or culture." ]
[ 19, 13, 4, 17, 3, 8, 133 ]
[ "passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n## Description\n\nThe latest AI movie trailers for the latent space## Model\n\nSVD## LoRA\n\njbilcke-hf/sdxl-cinematic-2## Voice\n\nJulian## Music\n\nIntense movie trailer cinematic## Prompt\n\nA video channel which produces movie trailer of fictional movies.\nTypical trailer will be about adventure, action, thriller, science fiction movies etc (so pretty much everything!).\nThe voice over commentary should imitate the style of movie trailer \"this summer..\" etc (but you can invent your own trailer commentary style!).\n\nTrailers should NEVER mention or be about existing cinema artists or movie studio (if you do this, you will be FIRED!)\n\nInstead invent your own movie names and trailer stories, content, characters etc\nCharacters should not be biased towards any specific gender, work, country, religion or culture." ]
9bdf0c3f75df9bbc9ff4b4b0c05644dc3e34428f
<h1 align="center"> 🎭 Roleplay TTL</h1> <p align="center"> <img src="https://bots-ttl.s3.amazonaws.com/intro1.png" alt="Your Image" width="500"> </p> <p align="center">Let AI be any characters you want to play with!</p> ## Dataset Overview This dataset trains conversational AI to embody a wide range of original characters, each with a unique persona. It includes fictional characters, complete with their own backgrounds, core traits, relationships, goals, and distinct speaking styles. ## Dataset Details - **Curated by:** [Hieu Minh Nguyen](mywebleo.com) - **Language(s) (NLP):** Primarily English (with potential for multilingual extensions) - **License:** Creative Commons Attribution 4.0 International License - **Version:** 1.0 (The new version will be updated soon with topics included for the dataset and 10000+ more entries.) ## Dataset Description ### The dataset includes: - Name and the description of the character. - System messages that define each character's persona. - Conversational exchanges demonstrating typical reactions in various scenarios. - Coverage of different emotions and topics, with direct quotes and signature linguistic tics. - Includes a wide array of characters, ranging from well-known fictional figures to **completely original, self-created personas**. #### Dataset Composition - **Number of Rows:** Over 5000 entries, each representing a unique interaction. - **Interaction Style:** Each dataset entry consists of a system message defining the character's traits, followed by 3-5 conversational exchanges between the character and a user. #### Dataset Goals and Applications - **Training Objectives:** Ideal for training AI in role-playing applications, chatbots, interactive storytelling, and creative writing tools. - **Research Value:** Useful for studies in character-driven narrative generation, conversational AI, and creative writing in AI. - **Out-of-Scope Use:** Not suited for tasks unrelated to conversational or creative AI. #### Conversational Dynamics - **Realism in Dialogue:** Each exchange is crafted to mirror realistic conversations, maintaining the authenticity of characters' voices. - **Language Variability:** Diverse linguistic styles and dialects are used, tailored to each character's background and persona. - **Humor and Wit:** Includes witty banter and humorous exchanges, adding a layer of entertainment and relatability. ## Dataset Structure - `name`: Name of the character. - `description`: Detailed description of the character's persona. - `text`: Corresponding responses in the character's unique style. The "text" field is formatted as follows (the system message and 4-5 following conversations): <|system|>...</s>\n<|user|>...</s>\n<|assistant|>...</s>\n<|user|>...</s>\n<|assistant|>...</s> ## Data Creation and Processing Characters are created using the imaginative writing of [Gemini Pro](https://deepmind.google/technologies/gemini/#build-with-gemini), ensuring a diverse range of personas. Conversations are scripted to reflect different scenarios, emotions, and interactions. ---
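As a small illustration of working with that format, the sketch below splits a `text` value back into (role, content) turns; it relies only on the special tokens quoted in the card above, so the sample string here is made up for demonstration.

```python
import re


def split_turns(text: str):
    """Split a roleplay 'text' string into (role, content) pairs.

    Assumes the <|system|>/<|user|>/<|assistant|> markers and </s> terminators
    described in the dataset card.
    """
    pattern = re.compile(r"<\|(system|user|assistant)\|>(.*?)</s>", re.DOTALL)
    return [(m.group(1), m.group(2).strip()) for m in pattern.finditer(text)]


sample = (
    "<|system|>You are a stoic knight.</s>\n"
    "<|user|>Who are you?</s>\n"
    "<|assistant|>A humble servant of the realm.</s>"
)
for role, content in split_turns(sample):
    print(role, "->", content)
```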
hieunguyenminh/roleplay
[ "task_categories:conversational", "task_categories:text-generation", "task_categories:question-answering", "size_categories:1K<n<10K", "language:en", "license:cc-by-4.0", "roleplay", "characters", "region:us" ]
2023-12-08T20:26:52+00:00
{"language": ["en"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["conversational", "text-generation", "question-answering"], "dataset_info": {"features": [{"name": "name", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14924724, "num_examples": 5755}], "download_size": 2153926, "dataset_size": 14924724}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["roleplay", "characters"]}
2024-01-19T17:19:17+00:00
[]
[ "en" ]
TAGS #task_categories-conversational #task_categories-text-generation #task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-4.0 #roleplay #characters #region-us
<h1 align="center"> Roleplay TTL</h1> <p align="center"> <img src="URL alt="Your Image" width="500"> </p> <p align="center">Let AI be any characters you want to play with!</p> ## Dataset Overview This dataset trains conversational AI to embody a wide range of original characters, each with a unique persona. It includes fictional characters, complete with their own backgrounds, core traits, relationships, goals, and distinct speaking styles. ## Dataset Details - Curated by: Hieu Minh Nguyen - Language(s) (NLP): Primarily English (with potential for multilingual extensions) - License: Creative Commons Attribution 4.0 International License - Version: 1.0 (The new version will be updated soon with topics included for the dataset and 10000+ more entries.) ## Dataset Description ### The dataset includes: - Name and the description of the character. - System messages that define each character's persona. - Conversational exchanges demonstrating typical reactions in various scenarios. - Coverage of different emotions and topics, with direct quotes and signature linguistic ticks. - Includes a wide array of characters, ranging from well-known fictional figures to completely original, self-created personas. #### Dataset Composition - Number of Rows: Over 5000 entries, each representing a unique interaction. - Interaction Style: Each dataset entry consists of a system message defining the character's traits, followed by 3-5 conversational exchanges between the character and a user. #### Dataset Goals and Applications - Training Objectives: Ideal for training AI in role-playing applications, chatbots, interactive storytelling, and creative writing tools. - Research Value: Useful for studies in character-driven narrative generation, conversational AI, and creative writing in AI. - Out-of-Scope Use: Not suited for tasks unrelated to conversational or creative AI. #### Conversational Dynamics - Realism in Dialogue: Each exchange is crafted to mirror realistic conversations, maintaining the authenticity of characters' voices. - Language Variability: Diverse linguistic styles and dialects are used, tailored to each character's background and persona. - Humor and Wit: Includes witty banter and humorous exchanges, adding a layer of entertainment and relatability. ## Dataset Structure - 'name': Name of the character. - 'description': Detailed description of the character's persona. - 'text': Corresponding responses in the character's unique style. The "text" dataset is formatted as follows (the system message and 4-5 following conversations): <|system|>...</s>\n<|user|>...</s>\n<|assistant|>...</s>\n<|user|>\n<|assistant|>...</s> ## Data Creation and Processing Characters are created using imaginative writing of Gemini Pro, ensuring a diverse range of personas. Conversations are scripted to reflect different scenarios, emotions, and interactions. ---
[ "## Dataset Overview\n\nThis dataset trains conversational AI to embody a wide range of original characters, each with a unique persona. It includes fictional characters, complete with their own backgrounds, core traits, relationships, goals, and distinct speaking styles.", "## Dataset Details\n\n- Curated by: Hieu Minh Nguyen\n- Language(s) (NLP): Primarily English (with potential for multilingual extensions)\n- License: Creative Commons Attribution 4.0 International License\n- Version: 1.0 (The new version will be updated soon with topics included for the dataset and 10000+ more entries.)", "## Dataset Description", "### The dataset includes:\n- Name and the description of the character.\n- System messages that define each character's persona.\n- Conversational exchanges demonstrating typical reactions in various scenarios.\n- Coverage of different emotions and topics, with direct quotes and signature linguistic ticks.\n- Includes a wide array of characters, ranging from well-known fictional figures to completely original, self-created personas.", "#### Dataset Composition\n- Number of Rows: Over 5000 entries, each representing a unique interaction.\n- Interaction Style: Each dataset entry consists of a system message defining the character's traits, followed by 3-5 conversational exchanges between the character and a user.", "#### Dataset Goals and Applications\n- Training Objectives: Ideal for training AI in role-playing applications, chatbots, interactive storytelling, and creative writing tools.\n- Research Value: Useful for studies in character-driven narrative generation, conversational AI, and creative writing in AI.\n- Out-of-Scope Use: Not suited for tasks unrelated to conversational or creative AI.", "#### Conversational Dynamics\n- Realism in Dialogue: Each exchange is crafted to mirror realistic conversations, maintaining the authenticity of characters' voices.\n- Language Variability: Diverse linguistic styles and dialects are used, tailored to each character's background and persona.\n- Humor and Wit: Includes witty banter and humorous exchanges, adding a layer of entertainment and relatability.", "## Dataset Structure\n\n- 'name': Name of the character.\n- 'description': Detailed description of the character's persona. \n- 'text': Corresponding responses in the character's unique style.\n \nThe \"text\" dataset is formatted as follows (the system message and 4-5 following conversations):\n<|system|>...</s>\\n<|user|>...</s>\\n<|assistant|>...</s>\\n<|user|>\\n<|assistant|>...</s>", "## Data Creation and Processing\n\nCharacters are created using imaginative writing of Gemini Pro, ensuring a diverse range of personas. Conversations are scripted to reflect different scenarios, emotions, and interactions.\n\n---" ]
[ "TAGS\n#task_categories-conversational #task_categories-text-generation #task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-4.0 #roleplay #characters #region-us \n", "## Dataset Overview\n\nThis dataset trains conversational AI to embody a wide range of original characters, each with a unique persona. It includes fictional characters, complete with their own backgrounds, core traits, relationships, goals, and distinct speaking styles.", "## Dataset Details\n\n- Curated by: Hieu Minh Nguyen\n- Language(s) (NLP): Primarily English (with potential for multilingual extensions)\n- License: Creative Commons Attribution 4.0 International License\n- Version: 1.0 (The new version will be updated soon with topics included for the dataset and 10000+ more entries.)", "## Dataset Description", "### The dataset includes:\n- Name and the description of the character.\n- System messages that define each character's persona.\n- Conversational exchanges demonstrating typical reactions in various scenarios.\n- Coverage of different emotions and topics, with direct quotes and signature linguistic ticks.\n- Includes a wide array of characters, ranging from well-known fictional figures to completely original, self-created personas.", "#### Dataset Composition\n- Number of Rows: Over 5000 entries, each representing a unique interaction.\n- Interaction Style: Each dataset entry consists of a system message defining the character's traits, followed by 3-5 conversational exchanges between the character and a user.", "#### Dataset Goals and Applications\n- Training Objectives: Ideal for training AI in role-playing applications, chatbots, interactive storytelling, and creative writing tools.\n- Research Value: Useful for studies in character-driven narrative generation, conversational AI, and creative writing in AI.\n- Out-of-Scope Use: Not suited for tasks unrelated to conversational or creative AI.", "#### Conversational Dynamics\n- Realism in Dialogue: Each exchange is crafted to mirror realistic conversations, maintaining the authenticity of characters' voices.\n- Language Variability: Diverse linguistic styles and dialects are used, tailored to each character's background and persona.\n- Humor and Wit: Includes witty banter and humorous exchanges, adding a layer of entertainment and relatability.", "## Dataset Structure\n\n- 'name': Name of the character.\n- 'description': Detailed description of the character's persona. \n- 'text': Corresponding responses in the character's unique style.\n \nThe \"text\" dataset is formatted as follows (the system message and 4-5 following conversations):\n<|system|>...</s>\\n<|user|>...</s>\\n<|assistant|>...</s>\\n<|user|>\\n<|assistant|>...</s>", "## Data Creation and Processing\n\nCharacters are created using imaginative writing of Gemini Pro, ensuring a diverse range of personas. Conversations are scripted to reflect different scenarios, emotions, and interactions.\n\n---" ]
[ 72, 56, 73, 4, 97, 62, 89, 92, 118, 50 ]
[ "passage: TAGS\n#task_categories-conversational #task_categories-text-generation #task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-4.0 #roleplay #characters #region-us \n## Dataset Overview\n\nThis dataset trains conversational AI to embody a wide range of original characters, each with a unique persona. It includes fictional characters, complete with their own backgrounds, core traits, relationships, goals, and distinct speaking styles.## Dataset Details\n\n- Curated by: Hieu Minh Nguyen\n- Language(s) (NLP): Primarily English (with potential for multilingual extensions)\n- License: Creative Commons Attribution 4.0 International License\n- Version: 1.0 (The new version will be updated soon with topics included for the dataset and 10000+ more entries.)## Dataset Description### The dataset includes:\n- Name and the description of the character.\n- System messages that define each character's persona.\n- Conversational exchanges demonstrating typical reactions in various scenarios.\n- Coverage of different emotions and topics, with direct quotes and signature linguistic ticks.\n- Includes a wide array of characters, ranging from well-known fictional figures to completely original, self-created personas.#### Dataset Composition\n- Number of Rows: Over 5000 entries, each representing a unique interaction.\n- Interaction Style: Each dataset entry consists of a system message defining the character's traits, followed by 3-5 conversational exchanges between the character and a user.#### Dataset Goals and Applications\n- Training Objectives: Ideal for training AI in role-playing applications, chatbots, interactive storytelling, and creative writing tools.\n- Research Value: Useful for studies in character-driven narrative generation, conversational AI, and creative writing in AI.\n- Out-of-Scope Use: Not suited for tasks unrelated to conversational or creative AI." ]
4c13f6118c76102e4385814cbd694a23e353c224
# PLANT ORGANS Photos of various plants with their major above-ground organs labeled. Includes labels for stems, leaves, fruits, and flowers. Note that the categories listed above do not necessarily correspond to a correct botanical term for the given part of the plant photographed. Instead, they correspond to the conventional understanding of them. # ID - Label Map The following table describes pixel values corresponding to labels in provided masks. The first label, "void", represents the background. | Index | Label | |-------|-------| |0 | void | |1 | Fruit | |2 | Leaf | |3 | Flower | |4 | Stem |
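As a brief sketch of putting the ID-label map above to use, the snippet below builds `id2label`/`label2id` mappings and inspects one sample. The `image` and `label` column names follow the dataset_info in this record's metadata; treating mask pixel values as class indices is the assumption stated by the table.

```python
import numpy as np
from datasets import load_dataset

# Pixel value -> class name, as given by the ID - Label Map above ("void" = background).
id2label = {0: "void", 1: "Fruit", 2: "Leaf", 3: "Flower", 4: "Stem"}
label2id = {name: idx for idx, name in id2label.items()}

ds = load_dataset("jpodivin/plantorgans", split="train")
sample = ds[0]
image, mask = sample["image"], sample["label"]  # PIL images per the dataset_info features

print(image.size, mask.size)
print(np.unique(np.array(mask)))  # pixel values present in this mask
```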
jpodivin/plantorgans
[ "task_categories:image-segmentation", "size_categories:1K<n<10K", "language:en", "license:cdla-permissive-2.0", "biology", "doi:10.57967/hf/1606", "region:us" ]
2023-12-08T20:37:03+00:00
{"language": ["en"], "license": "cdla-permissive-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["image-segmentation"], "pretty_name": "plant organs", "tags": ["biology"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 9121146572.05, "num_examples": 5745}, {"name": "validation", "num_bytes": 2367801100.383, "num_examples": 1437}], "download_size": 11607836195, "dataset_size": 11488947672.432999}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]}
2024-01-14T14:48:07+00:00
[]
[ "en" ]
TAGS #task_categories-image-segmentation #size_categories-1K<n<10K #language-English #license-cdla-permissive-2.0 #biology #doi-10.57967/hf/1606 #region-us
PLANT ORGANS ============ Photos of various plants with their major, above ground organs labeled. Includes labels for stem, leafs, fruits and flowers. Note, that categories listed above do not necessarily correspond to a correct botanical term for the given part of the plant photographed. Instead they correspond to the conventional understanding of them. ID - Label Map ============== Following table describes pixel values corresponding to labels in provided masks. The first label, "void", represents the background.
[]
[ "TAGS\n#task_categories-image-segmentation #size_categories-1K<n<10K #language-English #license-cdla-permissive-2.0 #biology #doi-10.57967/hf/1606 #region-us \n" ]
[ 61 ]
[ "passage: TAGS\n#task_categories-image-segmentation #size_categories-1K<n<10K #language-English #license-cdla-permissive-2.0 #biology #doi-10.57967/hf/1606 #region-us \n" ]
011651d187be0651d5387ebefe798e1e950c205b
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Mihaiii/Pallas-0.2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.2](https://huggingface.co/Mihaiii/Pallas-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-10T14:40:43.951655](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.2/blob/main/results_2023-12-10T14-40-43.951655.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7511646602454795, "acc_stderr": 0.028726633655541643, "acc_norm": 0.755796537139697, "acc_norm_stderr": 0.029268159364361807, "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5527158465542162, "mc2_stderr": 0.015710456299665783 }, "harness|arc:challenge|25": { "acc": 0.6245733788395904, "acc_stderr": 0.014150631435111728, "acc_norm": 0.6450511945392492, "acc_norm_stderr": 0.013983036904094089 }, "harness|hellaswag|10": { "acc": 0.6434973112925712, "acc_stderr": 0.004779872250633712, "acc_norm": 0.8346942840071699, "acc_norm_stderr": 0.003706970856410969 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7111111111111111, "acc_stderr": 0.03915450630414251, "acc_norm": 0.7111111111111111, "acc_norm_stderr": 0.03915450630414251 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8618421052631579, "acc_stderr": 0.028081042939576552, "acc_norm": 0.8618421052631579, "acc_norm_stderr": 0.028081042939576552 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.77, "acc_stderr": 0.042295258468165044, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8, "acc_stderr": 0.02461829819586651, "acc_norm": 0.8, "acc_norm_stderr": 0.02461829819586651 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.875, "acc_stderr": 0.02765610492929436, "acc_norm": 0.875, "acc_norm_stderr": 0.02765610492929436 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.61, "acc_stderr": 
0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7514450867052023, "acc_stderr": 0.03295304696818317, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.03295304696818317 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5392156862745098, "acc_stderr": 0.04959859966384181, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.04959859966384181 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7617021276595745, "acc_stderr": 0.027851252973889774, "acc_norm": 0.7617021276595745, "acc_norm_stderr": 0.027851252973889774 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6140350877192983, "acc_stderr": 0.04579639422070434, "acc_norm": 0.6140350877192983, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7241379310344828, "acc_stderr": 0.03724563619774632, "acc_norm": 0.7241379310344828, "acc_norm_stderr": 0.03724563619774632 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6693121693121693, "acc_stderr": 0.02422996529842509, "acc_norm": 0.6693121693121693, "acc_norm_stderr": 0.02422996529842509 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04444444444444449, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9032258064516129, "acc_stderr": 0.016818943416345197, "acc_norm": 0.9032258064516129, "acc_norm_stderr": 0.016818943416345197 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6798029556650246, "acc_stderr": 0.032826493853041504, "acc_norm": 0.6798029556650246, "acc_norm_stderr": 0.032826493853041504 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8424242424242424, "acc_stderr": 0.028450388805284332, "acc_norm": 0.8424242424242424, "acc_norm_stderr": 0.028450388805284332 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9090909090909091, "acc_stderr": 0.020482086775424218, "acc_norm": 0.9090909090909091, "acc_norm_stderr": 0.020482086775424218 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9740932642487047, "acc_stderr": 0.01146452335695318, "acc_norm": 0.9740932642487047, "acc_norm_stderr": 0.01146452335695318 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8153846153846154, "acc_stderr": 0.0196716324131003, "acc_norm": 0.8153846153846154, "acc_norm_stderr": 0.0196716324131003 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.42962962962962964, "acc_stderr": 0.030182099804387262, "acc_norm": 0.42962962962962964, "acc_norm_stderr": 0.030182099804387262 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8319327731092437, "acc_stderr": 0.024289102115692282, "acc_norm": 0.8319327731092437, "acc_norm_stderr": 0.024289102115692282 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.46357615894039733, "acc_stderr": 0.04071636065944215, "acc_norm": 0.46357615894039733, "acc_norm_stderr": 0.04071636065944215 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9155963302752294, "acc_stderr": 0.011918819327334872, "acc_norm": 0.9155963302752294, "acc_norm_stderr": 0.011918819327334872 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6296296296296297, "acc_stderr": 0.03293377139415191, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.03293377139415191 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9264705882352942, "acc_stderr": 0.01831885585008968, "acc_norm": 0.9264705882352942, "acc_norm_stderr": 0.01831885585008968 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9029535864978903, "acc_stderr": 0.01926932302564027, "acc_norm": 0.9029535864978903, "acc_norm_stderr": 0.01926932302564027 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7937219730941704, "acc_stderr": 0.02715715047956382, "acc_norm": 0.7937219730941704, "acc_norm_stderr": 0.02715715047956382 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8549618320610687, "acc_stderr": 0.03088466108951538, "acc_norm": 0.8549618320610687, "acc_norm_stderr": 0.03088466108951538 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8842975206611571, "acc_stderr": 0.029199802455622814, "acc_norm": 0.8842975206611571, "acc_norm_stderr": 0.029199802455622814 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8611111111111112, "acc_stderr": 0.03343270062869623, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.03343270062869623 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8773006134969326, "acc_stderr": 0.025777328426978927, "acc_norm": 0.8773006134969326, "acc_norm_stderr": 0.025777328426978927 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5357142857142857, "acc_stderr": 0.04733667890053756, "acc_norm": 0.5357142857142857, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8640776699029126, "acc_stderr": 0.0339329572976101, "acc_norm": 0.8640776699029126, "acc_norm_stderr": 0.0339329572976101 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9401709401709402, "acc_stderr": 0.015537514263253872, "acc_norm": 0.9401709401709402, "acc_norm_stderr": 0.015537514263253872 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8991060025542784, "acc_stderr": 0.010770472014886713, "acc_norm": 0.8991060025542784, "acc_norm_stderr": 0.010770472014886713 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8121387283236994, "acc_stderr": 0.021029269752423224, "acc_norm": 0.8121387283236994, "acc_norm_stderr": 0.021029269752423224 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.7083798882681565, "acc_stderr": 0.015201032512520418, "acc_norm": 0.7083798882681565, "acc_norm_stderr": 0.015201032512520418 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.803921568627451, "acc_stderr": 0.022733789405447603, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.022733789405447603 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7909967845659164, "acc_stderr": 0.023093140398374224, "acc_norm": 0.7909967845659164, "acc_norm_stderr": 0.023093140398374224 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8734567901234568, "acc_stderr": 0.018498600558790906, "acc_norm": 
0.8734567901234568, "acc_norm_stderr": 0.018498600558790906 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6205673758865248, "acc_stderr": 0.02894733885161409, "acc_norm": 0.6205673758865248, "acc_norm_stderr": 0.02894733885161409 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5847457627118644, "acc_stderr": 0.012585471793400665, "acc_norm": 0.5847457627118644, "acc_norm_stderr": 0.012585471793400665 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8125, "acc_stderr": 0.023709788253811766, "acc_norm": 0.8125, "acc_norm_stderr": 0.023709788253811766 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8137254901960784, "acc_stderr": 0.01575052628436335, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.01575052628436335 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8367346938775511, "acc_stderr": 0.02366169917709861, "acc_norm": 0.8367346938775511, "acc_norm_stderr": 0.02366169917709861 }, "harness|hendrycksTest-sociology|5": { "acc": 0.9054726368159204, "acc_stderr": 0.020687186951534108, "acc_norm": 0.9054726368159204, "acc_norm_stderr": 0.020687186951534108 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.027265992434429103, "acc_norm": 0.92, "acc_norm_stderr": 0.027265992434429103 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5527158465542162, "mc2_stderr": 0.015710456299665783 }, "harness|winogrande|5": { "acc": 0.813733228097869, "acc_stderr": 0.010941877955676207 }, "harness|gsm8k|5": { "acc": 0.6269901440485216, "acc_stderr": 0.013320876609777208 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
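### Loading the aggregated results

The loading example above pulls the "train" split of a single task configuration. As a minimal additional sketch, assuming the aggregated "results" configuration exposes the same timestamped and "latest" splits as the per-task configurations listed in this card's metadata, you can also pull the aggregated metrics or pin a specific run:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Mihaiii__Pallas-0.2"

# Aggregated metrics for the most recent run. Assumption: the "results"
# configuration provides a "latest" split, mirroring the per-task configs.
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for one MMLU subtask, pinned to a specific run by using
# its timestamped split name (taken from this card's config list) instead of
# the "latest" alias.
college_cs = load_dataset(
    REPO,
    "harness_hendrycksTest_college_computer_science_5",
    split="2023_12_10T14_40_43.951655",
)

print(results)
print(college_cs[0])
```

Each timestamped split corresponds to one of the 2 run(s) recorded for this model; the "latest" split always mirrors the most recent of them.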
open-llm-leaderboard/details_Mihaiii__Pallas-0.2
[ "region:us" ]
2023-12-08T21:21:17+00:00
{"pretty_name": "Evaluation run of Mihaiii/Pallas-0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.2](https://huggingface.co/Mihaiii/Pallas-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Pallas-0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T14:40:43.951655](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.2/blob/main/results_2023-12-10T14-40-43.951655.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7511646602454795,\n \"acc_stderr\": 0.028726633655541643,\n \"acc_norm\": 0.755796537139697,\n \"acc_norm_stderr\": 0.029268159364361807,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5527158465542162,\n \"mc2_stderr\": 0.015710456299665783\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094089\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6434973112925712,\n \"acc_stderr\": 0.004779872250633712,\n \"acc_norm\": 0.8346942840071699,\n \"acc_norm_stderr\": 0.003706970856410969\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02461829819586651,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02461829819586651\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889774,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889774\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6693121693121693,\n \"acc_stderr\": 0.02422996529842509,\n \"acc_norm\": 0.6693121693121693,\n \"acc_norm_stderr\": 0.02422996529842509\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6798029556650246,\n \"acc_stderr\": 0.032826493853041504,\n \"acc_norm\": 0.6798029556650246,\n \"acc_norm_stderr\": 0.032826493853041504\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284332,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284332\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.0196716324131003,\n \"acc_norm\": 
0.8153846153846154,\n \"acc_norm_stderr\": 0.0196716324131003\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8319327731092437,\n \"acc_stderr\": 0.024289102115692282,\n \"acc_norm\": 0.8319327731092437,\n \"acc_norm_stderr\": 0.024289102115692282\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334872,\n \"acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334872\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564027,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564027\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869623,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869623\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253872,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253872\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8991060025542784,\n \"acc_stderr\": 0.010770472014886713,\n \"acc_norm\": 0.8991060025542784,\n \"acc_norm_stderr\": 0.010770472014886713\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.021029269752423224,\n \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.021029269752423224\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7083798882681565,\n \"acc_stderr\": 0.015201032512520418,\n \"acc_norm\": 0.7083798882681565,\n \"acc_norm_stderr\": 0.015201032512520418\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.022733789405447603,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.022733789405447603\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n \"acc_stderr\": 0.023093140398374224,\n \"acc_norm\": 0.7909967845659164,\n \"acc_norm_stderr\": 0.023093140398374224\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790906,\n \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790906\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.02894733885161409,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.02894733885161409\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5847457627118644,\n \"acc_stderr\": 0.012585471793400665,\n \"acc_norm\": 0.5847457627118644,\n \"acc_norm_stderr\": 0.012585471793400665\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.01575052628436335,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.01575052628436335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.020687186951534108,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.020687186951534108\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429103,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429103\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5527158465542162,\n \"mc2_stderr\": 0.015710456299665783\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6269901440485216,\n \"acc_stderr\": 0.013320876609777208\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Pallas-0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|arc:challenge|25_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|arc:challenge|25_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|gsm8k|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|gsm8k|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hellaswag|10_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hellaswag|10_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T21-18-28.052957.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T21-18-28.052957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T14-40-43.951655.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T14-40-43.951655.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T14-40-43.951655.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T14-40-43.951655.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T21-18-28.052957.parquet"]}, 
{"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["**/details_harness|winogrande|5_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": ["**/details_harness|winogrande|5_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T14-40-43.951655.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T21_18_28.052957", "path": ["results_2023-12-08T21-18-28.052957.parquet"]}, {"split": "2023_12_10T14_40_43.951655", "path": 
["results_2023-12-10T14-40-43.951655.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T14-40-43.951655.parquet"]}]}]}
2023-12-10T14:44:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Mihaiii/Pallas-0.2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-10T14:40:43.951655 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
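The loading snippet referenced above ("To load the details from a run, you can for instance do the following:") is not reproduced in this record and its URLs are collapsed to "URL", so the block below is only a minimal sketch: the repo id `open-llm-leaderboard/details_Mihaiii__Pallas-0.2` and the `harness_winogrande_5` config name are assumptions inferred from the `details_<org>__<model>` naming pattern used by the other evaluation-run cards in this dump.

```python
from datasets import load_dataset

# Assumed repo id, inferred from the details_<org>__<model> naming pattern of
# the other Open LLM Leaderboard detail datasets; it is not stated in this record.
repo_id = "open-llm-leaderboard/details_Mihaiii__Pallas-0.2"

# Load the per-sample details for one evaluated task; per the card, the
# "train" split always points at the latest results.
data = load_dataset(repo_id, "harness_winogrande_5", split="train")
print(data)
```

Any other per-task config listed in this record's metadata (or the "results" config) can be loaded the same way; individual runs are exposed as the timestamped splits named in that metadata.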
[ "# Dataset Card for Evaluation run of Mihaiii/Pallas-0.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-10T14:40:43.951655(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Mihaiii/Pallas-0.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-10T14:40:43.951655(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 16, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Pallas-0.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-10T14:40:43.951655(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
51781d1f312fa3081b917747dc2b0163bdc634e9
# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.4 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/migtissera/Tess-34B-v1.4 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [migtissera/Tess-34B-v1.4](https://huggingface.co/migtissera/Tess-34B-v1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-34B-v1.4", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-08T21:42:14.185157](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-34B-v1.4/blob/main/results_2023-12-08T21-42-14.185157.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.744827149985735, "acc_stderr": 0.029006208191077444, "acc_norm": 0.7498384630409163, "acc_norm_stderr": 0.02955308381117489, "mc1": 0.412484700122399, "mc1_stderr": 0.01723329939957122, "mc2": 0.5679144363917273, "mc2_stderr": 0.01582118053131118 }, "harness|arc:challenge|25": { "acc": 0.6228668941979523, "acc_stderr": 0.0141633668961926, "acc_norm": 0.6459044368600683, "acc_norm_stderr": 0.01397545412275656 }, "harness|hellaswag|10": { "acc": 0.6419040031866162, "acc_stderr": 0.004784607222774645, "acc_norm": 0.833698466440948, "acc_norm_stderr": 0.0037159010850549854 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6962962962962963, "acc_stderr": 0.03972552884785136, "acc_norm": 0.6962962962962963, "acc_norm_stderr": 0.03972552884785136 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8618421052631579, "acc_stderr": 0.028081042939576552, "acc_norm": 0.8618421052631579, "acc_norm_stderr": 0.028081042939576552 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.77, "acc_stderr": 0.042295258468165044, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8037735849056604, "acc_stderr": 0.02444238813110082, "acc_norm": 0.8037735849056604, "acc_norm_stderr": 0.02444238813110082 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.875, "acc_stderr": 0.02765610492929436, "acc_norm": 0.875, "acc_norm_stderr": 0.02765610492929436 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7398843930635838, "acc_stderr": 0.033450369167889904, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.033450369167889904 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5392156862745098, "acc_stderr": 0.04959859966384181, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.04959859966384181 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7702127659574468, "acc_stderr": 0.027501752944412417, "acc_norm": 0.7702127659574468, "acc_norm_stderr": 0.027501752944412417 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6228070175438597, "acc_stderr": 0.04559522141958216, "acc_norm": 0.6228070175438597, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7379310344827587, "acc_stderr": 0.03664666337225257, "acc_norm": 0.7379310344827587, "acc_norm_stderr": 0.03664666337225257 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6851851851851852, "acc_stderr": 0.023919984164047736, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.023919984164047736 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5476190476190477, "acc_stderr": 0.044518079590553275, "acc_norm": 0.5476190476190477, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9032258064516129, "acc_stderr": 0.016818943416345197, "acc_norm": 0.9032258064516129, "acc_norm_stderr": 0.016818943416345197 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6748768472906403, "acc_stderr": 0.032957975663112704, "acc_norm": 0.6748768472906403, "acc_norm_stderr": 0.032957975663112704 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8363636363636363, "acc_stderr": 0.028887872395487946, "acc_norm": 0.8363636363636363, "acc_norm_stderr": 0.028887872395487946 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9191919191919192, "acc_stderr": 0.019417681889724536, "acc_norm": 0.9191919191919192, "acc_norm_stderr": 0.019417681889724536 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9689119170984456, "acc_stderr": 0.012525310625527041, "acc_norm": 0.9689119170984456, "acc_norm_stderr": 0.012525310625527041 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7974358974358975, "acc_stderr": 0.020377660970371393, "acc_norm": 0.7974358974358975, "acc_norm_stderr": 0.020377660970371393 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4148148148148148, "acc_stderr": 0.03003984245406929, "acc_norm": 0.4148148148148148, "acc_norm_stderr": 0.03003984245406929 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02476290267805793, "acc_norm": 
0.8235294117647058, "acc_norm_stderr": 0.02476290267805793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.48344370860927155, "acc_stderr": 0.0408024418562897, "acc_norm": 0.48344370860927155, "acc_norm_stderr": 0.0408024418562897 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9100917431192661, "acc_stderr": 0.012264304540230446, "acc_norm": 0.9100917431192661, "acc_norm_stderr": 0.012264304540230446 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.625, "acc_stderr": 0.033016908987210894, "acc_norm": 0.625, "acc_norm_stderr": 0.033016908987210894 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9313725490196079, "acc_stderr": 0.017744453647073315, "acc_norm": 0.9313725490196079, "acc_norm_stderr": 0.017744453647073315 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8945147679324894, "acc_stderr": 0.019995560723758535, "acc_norm": 0.8945147679324894, "acc_norm_stderr": 0.019995560723758535 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7892376681614349, "acc_stderr": 0.027373095500540186, "acc_norm": 0.7892376681614349, "acc_norm_stderr": 0.027373095500540186 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8396946564885496, "acc_stderr": 0.03217829420744631, "acc_norm": 0.8396946564885496, "acc_norm_stderr": 0.03217829420744631 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8760330578512396, "acc_stderr": 0.030083098716035202, "acc_norm": 0.8760330578512396, "acc_norm_stderr": 0.030083098716035202 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8518518518518519, "acc_stderr": 0.03434300243631001, "acc_norm": 0.8518518518518519, "acc_norm_stderr": 0.03434300243631001 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8650306748466258, "acc_stderr": 0.02684576505455386, "acc_norm": 0.8650306748466258, "acc_norm_stderr": 0.02684576505455386 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9230769230769231, "acc_stderr": 0.017456987872436186, "acc_norm": 0.9230769230769231, "acc_norm_stderr": 0.017456987872436186 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9042145593869731, "acc_stderr": 0.01052403107905584, "acc_norm": 0.9042145593869731, "acc_norm_stderr": 0.01052403107905584 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8208092485549133, "acc_stderr": 0.020647590029679332, "acc_norm": 0.8208092485549133, "acc_norm_stderr": 0.020647590029679332 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6737430167597765, "acc_stderr": 0.01568044151888918, "acc_norm": 0.6737430167597765, "acc_norm_stderr": 0.01568044151888918 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7973856209150327, "acc_stderr": 0.023015446877985665, "acc_norm": 0.7973856209150327, "acc_norm_stderr": 0.023015446877985665 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7845659163987139, "acc_stderr": 0.023350225475471442, "acc_norm": 0.7845659163987139, "acc_norm_stderr": 0.023350225475471442 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8734567901234568, "acc_stderr": 0.018498600558790906, 
"acc_norm": 0.8734567901234568, "acc_norm_stderr": 0.018498600558790906 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6134751773049646, "acc_stderr": 0.029049190342543465, "acc_norm": 0.6134751773049646, "acc_norm_stderr": 0.029049190342543465 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5795306388526728, "acc_stderr": 0.012607654553832701, "acc_norm": 0.5795306388526728, "acc_norm_stderr": 0.012607654553832701 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8014705882352942, "acc_stderr": 0.02423101337054108, "acc_norm": 0.8014705882352942, "acc_norm_stderr": 0.02423101337054108 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8022875816993464, "acc_stderr": 0.016112443369726732, "acc_norm": 0.8022875816993464, "acc_norm_stderr": 0.016112443369726732 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.04309118709946458, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.04309118709946458 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8367346938775511, "acc_stderr": 0.023661699177098615, "acc_norm": 0.8367346938775511, "acc_norm_stderr": 0.023661699177098615 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101706, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.027265992434429103, "acc_norm": 0.92, "acc_norm_stderr": 0.027265992434429103 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8771929824561403, "acc_stderr": 0.02517298435015577, "acc_norm": 0.8771929824561403, "acc_norm_stderr": 0.02517298435015577 }, "harness|truthfulqa:mc|0": { "mc1": 0.412484700122399, "mc1_stderr": 0.01723329939957122, "mc2": 0.5679144363917273, "mc2_stderr": 0.01582118053131118 }, "harness|winogrande|5": { "acc": 0.8121546961325967, "acc_stderr": 0.010977481103435091 }, "harness|gsm8k|5": { "acc": 0.5966641394996209, "acc_stderr": 0.013512654781814706 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_migtissera__Tess-34B-v1.4
[ "region:us" ]
2023-12-08T21:45:04+00:00
{"pretty_name": "Evaluation run of migtissera/Tess-34B-v1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-34B-v1.4](https://huggingface.co/migtissera/Tess-34B-v1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-34B-v1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T21:42:14.185157](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-34B-v1.4/blob/main/results_2023-12-08T21-42-14.185157.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.744827149985735,\n \"acc_stderr\": 0.029006208191077444,\n \"acc_norm\": 0.7498384630409163,\n \"acc_norm_stderr\": 0.02955308381117489,\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5679144363917273,\n \"mc2_stderr\": 0.01582118053131118\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.0141633668961926,\n \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.01397545412275656\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6419040031866162,\n \"acc_stderr\": 0.004784607222774645,\n \"acc_norm\": 0.833698466440948,\n \"acc_norm_stderr\": 0.0037159010850549854\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n \"acc_stderr\": 0.03972552884785136,\n \"acc_norm\": 0.6962962962962963,\n \"acc_norm_stderr\": 0.03972552884785136\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.02444238813110082,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.02444238813110082\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.027501752944412417,\n \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.027501752944412417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7379310344827587,\n \"acc_stderr\": 0.03664666337225257,\n \"acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.03664666337225257\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047736,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047736\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.028887872395487946,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.028887872395487946\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527041,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527041\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7974358974358975,\n \"acc_stderr\": 0.020377660970371393,\n 
\"acc_norm\": 0.7974358974358975,\n \"acc_norm_stderr\": 0.020377660970371393\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.03003984245406929,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.03003984245406929\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02476290267805793,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02476290267805793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230446,\n \"acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230446\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758535,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758535\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.027373095500540186,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.027373095500540186\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455386,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455386\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436186,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436186\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.01052403107905584,\n \"acc_norm\": 0.9042145593869731,\n \"acc_norm_stderr\": 0.01052403107905584\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n 
\"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6737430167597765,\n \"acc_stderr\": 0.01568044151888918,\n \"acc_norm\": 0.6737430167597765,\n \"acc_norm_stderr\": 0.01568044151888918\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7973856209150327,\n \"acc_stderr\": 0.023015446877985665,\n \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.023015446877985665\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790906,\n \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790906\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6134751773049646,\n \"acc_stderr\": 0.029049190342543465,\n \"acc_norm\": 0.6134751773049646,\n \"acc_norm_stderr\": 0.029049190342543465\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5795306388526728,\n \"acc_stderr\": 0.012607654553832701,\n \"acc_norm\": 0.5795306388526728,\n \"acc_norm_stderr\": 0.012607654553832701\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.02423101337054108,\n \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.02423101337054108\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8022875816993464,\n \"acc_stderr\": 0.016112443369726732,\n \"acc_norm\": 0.8022875816993464,\n \"acc_norm_stderr\": 0.016112443369726732\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098615,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098615\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429103,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429103\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5679144363917273,\n \"mc2_stderr\": 0.01582118053131118\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5966641394996209,\n \"acc_stderr\": 0.013512654781814706\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Tess-34B-v1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|arc:challenge|25_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|gsm8k|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hellaswag|10_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T21-42-14.185157.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T21-42-14.185157.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T21-42-14.185157.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T21-42-14.185157.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T21-42-14.185157.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["**/details_harness|winogrande|5_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T21-42-14.185157.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T21_42_14.185157", "path": ["results_2023-12-08T21-42-14.185157.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T21-42-14.185157.parquet"]}]}]}
2023-12-08T21:45:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.4 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model migtissera/Tess-34B-v1.4 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-08T21:42:14.185157 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
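The loading snippet referenced above is not reproduced in this stripped text; a minimal sketch in the style used by these leaderboard cards is shown below. The repository id follows the usual "details_<org>__<model>" naming convention and is an assumption, not a verified path.

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming convention; verify before use.
data = load_dataset(
    "open-llm-leaderboard/details_migtissera__Tess-34B-v1.4",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # always points at the latest results
)
```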
[ "# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-34B-v1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T21:42:14.185157(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-34B-v1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-08T21:42:14.185157(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-34B-v1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-08T21:42:14.185157(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1f22ea7379c2c2c0f8a93c24053f91df0c4a74fa
# Dataset Card for "flan_plus_cot_100k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
monology/flan_plus_cot_100k
[ "region:us" ]
2023-12-08T23:36:04+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "dataset", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "messages", "list": [{"name": "role", "dtype": "string"}, {"name": "content", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 375678160.6229266, "num_examples": 98870}], "download_size": 88030549, "dataset_size": 375678160.6229266}}
2023-12-08T23:36:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "flan_plus_cot_100k" More Information needed
[ "# Dataset Card for \"flan_plus_cot_100k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"flan_plus_cot_100k\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"flan_plus_cot_100k\"\n\nMore Information needed" ]
ee1e0f7a913f43aa3dfe50ab4dab44ba782a7104
# Dataset Card for "ultrachat_10k_nl" A translated version of 10k randomly selected examples from [HuggingFaceH4/ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k). Automatically translated by GPT-3.5. ## More info Read more about GEITje-chat, the datasets and the translation code in the [📄 README](https://github.com/Rijgersberg/GEITje/blob/main/README-en.md) on GitHub.
Rijgersberg/ultrachat_10k_nl
[ "task_categories:conversational", "task_categories:text-generation", "size_categories:10K<n<100K", "language:nl", "language:en", "license:cc-by-nc-4.0", "GEITje", "region:us" ]
2023-12-09T01:05:26+00:00
{"language": ["nl", "en"], "license": "cc-by-nc-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "Ultrachat 10k NL", "configs": [{"config_name": "default", "data_files": [{"split": "test_sft", "path": "data/test_sft-*"}, {"split": "train_sft", "path": "data/train_sft-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages_nl", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "test_sft", "num_bytes": 6296981, "num_examples": 500}, {"name": "train_sft", "num_bytes": 120475850, "num_examples": 9500}], "download_size": 65516955, "dataset_size": 126772831}, "tags": ["GEITje"]}
2023-12-15T15:26:58+00:00
[]
[ "nl", "en" ]
TAGS #task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Dutch #language-English #license-cc-by-nc-4.0 #GEITje #region-us
# Dataset Card for "ultrachat_10k_nl" A translated version of 10k randomly selected examples from HuggingFaceH4/ultrachat_200k. Automatically translated by GPT-3.5. ## More info Read more about GEITje-chat, the datasets and the translation code in the README on GitHub.
[ "# Dataset Card for \"ultrachat_10k_nl\"\n\nA translated version of 10k randomly selected examples from HuggingFaceH4/ultrachat_200k.\nAutomatically translated by GPT-3.5.", "## More info\nRead more about GEITje-chat, the datasets and the translation code in the README on GitHub." ]
[ "TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Dutch #language-English #license-cc-by-nc-4.0 #GEITje #region-us \n", "# Dataset Card for \"ultrachat_10k_nl\"\n\nA translated version of 10k randomly selected examples from HuggingFaceH4/ultrachat_200k.\nAutomatically translated by GPT-3.5.", "## More info\nRead more about GEITje-chat, the datasets and the translation code in the README on GitHub." ]
[ 64, 53, 29 ]
[ "passage: TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Dutch #language-English #license-cc-by-nc-4.0 #GEITje #region-us \n# Dataset Card for \"ultrachat_10k_nl\"\n\nA translated version of 10k randomly selected examples from HuggingFaceH4/ultrachat_200k.\nAutomatically translated by GPT-3.5.## More info\nRead more about GEITje-chat, the datasets and the translation code in the README on GitHub." ]
c7a1cd57402b25ad1cfce339e74eeaff832b9f2e
# Dataset Card for "pile_dedupe_train" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
xwjiang2010/pile_dedupe_train
[ "region:us" ]
2023-12-09T03:31:06+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 88191974579, "num_examples": 15000000}], "download_size": 20794320583, "dataset_size": 88191974579}}
2023-12-09T16:56:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "pile_dedupe_train" More Information needed
[ "# Dataset Card for \"pile_dedupe_train\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"pile_dedupe_train\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"pile_dedupe_train\"\n\nMore Information needed" ]
054a51d1baf0fc1419d0848fba1a7d3a07df5c4f
# Dataset Card for "no_robots_nl" A translated version of all 10k examples from [HuggingFaceH4/no_robots](https://huggingface.co/datasets/HuggingFaceH4/no_robots). Automatically translated by GPT-3.5. ## More info Read more about GEITje-chat, the datasets and the translation code in the [📄 README](https://github.com/Rijgersberg/GEITje/blob/main/README-en.md) on GitHub.
Rijgersberg/no_robots_nl
[ "task_categories:conversational", "task_categories:text-generation", "size_categories:10K<n<100K", "language:nl", "language:en", "license:cc-by-nc-4.0", "GEITje", "region:us" ]
2023-12-09T03:54:11+00:00
{"language": ["nl", "en"], "license": "cc-by-nc-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "No Robots NL", "configs": [{"config_name": "default", "data_files": [{"split": "test_sft", "path": "data/test_sft-*"}, {"split": "train_sft", "path": "data/train_sft-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "category", "dtype": "string"}, {"name": "messages_nl", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "test_sft", "num_bytes": 1517416, "num_examples": 500}, {"name": "train_sft", "num_bytes": 28407005, "num_examples": 9500}], "download_size": 18675565, "dataset_size": 29924421}, "tags": ["GEITje"]}
2023-12-15T15:24:43+00:00
[]
[ "nl", "en" ]
TAGS #task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Dutch #language-English #license-cc-by-nc-4.0 #GEITje #region-us
# Dataset Card for "no_robots_nl" A translated version of all 10k examples from HuggingFaceH4/no_robots. Automatically translated by GPT-3.5. ## More info Read more about GEITje-chat, the datasets and the translation code in the README on GitHub.
[ "# Dataset Card for \"no_robots_nl\"\n\nA translated version of all 10k examples from HuggingFaceH4/no_robots.\nAutomatically translated by GPT-3.5.", "## More info\nRead more about GEITje-chat, the datasets and the translation code in the README on GitHub." ]
[ "TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Dutch #language-English #license-cc-by-nc-4.0 #GEITje #region-us \n", "# Dataset Card for \"no_robots_nl\"\n\nA translated version of all 10k examples from HuggingFaceH4/no_robots.\nAutomatically translated by GPT-3.5.", "## More info\nRead more about GEITje-chat, the datasets and the translation code in the README on GitHub." ]
[ 64, 47, 29 ]
[ "passage: TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Dutch #language-English #license-cc-by-nc-4.0 #GEITje #region-us \n# Dataset Card for \"no_robots_nl\"\n\nA translated version of all 10k examples from HuggingFaceH4/no_robots.\nAutomatically translated by GPT-3.5.## More info\nRead more about GEITje-chat, the datasets and the translation code in the README on GitHub." ]
a8d39b70f84a8fd8ae5d365582fbacedf53db9c3
# Dataset Card for "librispeech_asr-audiodec_audiodec_24k_320d" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
anthony-wss/librispeech_asr-audiodec_audiodec_24k_320d
[ "region:us" ]
2023-12-09T04:57:56+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train.clean.360", "path": "data/train.clean.360-*"}, {"split": "train.other.500", "path": "data/train.other.500-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "train.clean.360", "num_bytes": 4496389172, "num_examples": 104014}, {"name": "train.other.500", "num_bytes": 6144041489, "num_examples": 148688}], "download_size": 2186014174, "dataset_size": 10640430661}}
2023-12-09T05:03:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for "librispeech_asr-audiodec_audiodec_24k_320d" More Information needed
[ "# Dataset Card for \"librispeech_asr-audiodec_audiodec_24k_320d\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"librispeech_asr-audiodec_audiodec_24k_320d\"\n\nMore Information needed" ]
[ 6, 30 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"librispeech_asr-audiodec_audiodec_24k_320d\"\n\nMore Information needed" ]
74b3e5a398670c098cc4b2596b9fa3de9f19cfeb
## Data Description This is a pre-processed version of the [Nectar](https://huggingface.co/datasets/berkeley-nest/Nectar) dataset, processed in the same way as [ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized), which was used to train Zephyr-7B-β, a state-of-the-art chat model at the 7B parameter scale. This dataset can be easily used with the [alignment-handbook](https://github.com/huggingface/alignment-handbook/tree/main) to run the **DPO** process for your models on the [Nectar](https://huggingface.co/datasets/berkeley-nest/Nectar) data. The original [Nectar](https://huggingface.co/datasets/berkeley-nest/Nectar) dataset consists of 183k prompts, along with high-quality and diverse responses, and accurate ranking labels. We use the rank-1 response as "chosen" and randomly select 1 response from ranks 2-7 as "rejected". ## Citation If you find this dataset useful in your work, please cite the original Nectar dataset: https://huggingface.co/datasets/berkeley-nest/Nectar You may also wish to cite our repo: <pre><code>@misc{gao2023nectarb, title = {Nectar_binarized}, url = {https://huggingface.co/datasets/HongchengGao/Nectar_binarized/blob/main/README.md}, author = {Hongcheng Gao}, month = {December}, year = {2023} } </code></pre>
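The binarization rule above can be expressed as a short script. The following is only an illustrative sketch: it assumes the original Nectar schema exposes a "prompt" column and a rank-ordered "answers" list with "answer" and "rank" fields, which should be checked against the source dataset before use.

```python
import random

from datasets import load_dataset

nectar = load_dataset("berkeley-nest/Nectar", split="train")

def binarize(example):
    # Assumes each answer carries a "rank" field, with rank 1 being the best response.
    answers = sorted(example["answers"], key=lambda a: a["rank"])
    chosen = answers[0]["answer"]
    rejected = random.choice(answers[1:7])["answer"]  # random pick from ranks 2-7
    return {"prompt": example["prompt"], "chosen": chosen, "rejected": rejected}

binarized = nectar.map(binarize, remove_columns=nectar.column_names)
```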
HongchengGao/Nectar_binarized
[ "region:us" ]
2023-12-09T05:33:35+00:00
{}
2023-12-09T10:14:42+00:00
[]
[]
TAGS #region-us
## Data Description This is a pre-processed version of the Nectar dataset, processed in the same way as ultrafeedback_binarized, which was used to train Zephyr-7B-β, a state-of-the-art chat model at the 7B parameter scale. This dataset can be easily used with alignment-handbook to run the DPO process for your models on the Nectar data. The original Nectar dataset consists of 183k prompts, along with high-quality and diverse responses, and accurate ranking labels. We use the rank-1 response as "chosen" and randomly select 1 response from ranks 2-7 as "rejected". If you find this dataset useful in your work, please cite the original Nectar dataset: URL You may also wish to cite our repo: <pre><code>@misc{gao2023nectarb, title = {Nectar_binarized}, url = {URL author = {Hongcheng Gao}, month = {December}, year = {2023} } </code></pre>
[ "## Data Description\nThis is a pre-processed version of the Nectar dataset and was processed like ultrafeedback_binarized which was used to train Zephyr-7Β-β, a state of the art chat model at the 7B parameter scale.\n\nThis dataset can be easily used with alignment-handbook to do DPO process for your models using Nectar dataset.\n\nThe original Nectar dataset consists of 183k prompts, along with high-quality and diverse responses, and accurate ranking labels. We use the rank1 response as \"chosen\" while random select 1 response from rank2~7 as \"rejected\".\n\nIf you find this dataset is useful in your work, please cite the original Nectar dataset: \nURL\n\nYou may also wish to cite our repo:\n<pre><code>@misc{gao2023nectarb,\n title = {Nectar_binarized},\n url = {URL\n author = {Hongcheng Gao},\n month = {December},\n year = {2023}\n}\n</code></pre>" ]
[ "TAGS\n#region-us \n", "## Data Description\nThis is a pre-processed version of the Nectar dataset and was processed like ultrafeedback_binarized which was used to train Zephyr-7Β-β, a state of the art chat model at the 7B parameter scale.\n\nThis dataset can be easily used with alignment-handbook to do DPO process for your models using Nectar dataset.\n\nThe original Nectar dataset consists of 183k prompts, along with high-quality and diverse responses, and accurate ranking labels. We use the rank1 response as \"chosen\" while random select 1 response from rank2~7 as \"rejected\".\n\nIf you find this dataset is useful in your work, please cite the original Nectar dataset: \nURL\n\nYou may also wish to cite our repo:\n<pre><code>@misc{gao2023nectarb,\n title = {Nectar_binarized},\n url = {URL\n author = {Hongcheng Gao},\n month = {December},\n year = {2023}\n}\n</code></pre>" ]
[ 6, 236 ]
[ "passage: TAGS\n#region-us \n## Data Description\nThis is a pre-processed version of the Nectar dataset and was processed like ultrafeedback_binarized which was used to train Zephyr-7Β-β, a state of the art chat model at the 7B parameter scale.\n\nThis dataset can be easily used with alignment-handbook to do DPO process for your models using Nectar dataset.\n\nThe original Nectar dataset consists of 183k prompts, along with high-quality and diverse responses, and accurate ranking labels. We use the rank1 response as \"chosen\" while random select 1 response from rank2~7 as \"rejected\".\n\nIf you find this dataset is useful in your work, please cite the original Nectar dataset: \nURL\n\nYou may also wish to cite our repo:\n<pre><code>@misc{gao2023nectarb,\n title = {Nectar_binarized},\n url = {URL\n author = {Hongcheng Gao},\n month = {December},\n year = {2023}\n}\n</code></pre>" ]
4488151fd6aacac74cca9c40d9a511694bedac74
# Multilingual Speech 20000h This dataset contains data crawled from the internet and resampled to 44.1 kHz MP3 files.
fish-audio-private/playerfm-20000h
[ "size_categories:1M<n<10M", "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-09T07:35:27+00:00
{"license": "cc-by-nc-sa-4.0", "size_categories": ["1M<n<10M"]}
2023-12-10T20:37:41+00:00
[]
[]
TAGS #size_categories-1M<n<10M #license-cc-by-nc-sa-4.0 #region-us
# Multilingual Speech 20000h This dataset contains data crawled from internet and resampled to 44.1k mp3 files.
[ "# Multilingual Speech 20000h\n\nThis dataset contains data crawled from internet and resampled to 44.1k mp3 files." ]
[ "TAGS\n#size_categories-1M<n<10M #license-cc-by-nc-sa-4.0 #region-us \n", "# Multilingual Speech 20000h\n\nThis dataset contains data crawled from internet and resampled to 44.1k mp3 files." ]
[ 31, 30 ]
[ "passage: TAGS\n#size_categories-1M<n<10M #license-cc-by-nc-sa-4.0 #region-us \n# Multilingual Speech 20000h\n\nThis dataset contains data crawled from internet and resampled to 44.1k mp3 files." ]
62f66ec0671bf3ee6ddfa004a4fa191eac1ed016
# Dataset Card for "librispeech_asr-audiodec_academicodec_hifi_24k_320d" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
anthony-wss/librispeech_asr-audiodec_academicodec_hifi_24k_320d
[ "region:us" ]
2023-12-09T08:04:53+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train.clean.360", "path": "data/train.clean.360-*"}, {"split": "train.other.500", "path": "data/train.other.500-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "train.clean.360", "num_bytes": 2116756660, "num_examples": 104014}, {"name": "train.other.500", "num_bytes": 2891949041, "num_examples": 148688}], "download_size": 804477736, "dataset_size": 5008705701}}
2023-12-09T08:07:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "librispeech_asr-audiodec_academicodec_hifi_24k_320d" More Information needed
[ "# Dataset Card for \"librispeech_asr-audiodec_academicodec_hifi_24k_320d\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"librispeech_asr-audiodec_academicodec_hifi_24k_320d\"\n\nMore Information needed" ]
[ 6, 34 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"librispeech_asr-audiodec_academicodec_hifi_24k_320d\"\n\nMore Information needed" ]
1a56012ed624f5c199e87390bcc3c7780f38b725
# AV-Deepfake1M This is the official repository for the paper [AV-Deepfake1M: A Large-Scale LLM-Driven Audio-Visual Deepfake Dataset](http://arxiv.org/abs/2311.15308). ## Abstract The detection and localization of highly realistic deepfake audio-visual content are challenging even for the most advanced state-of-the-art methods. While most of the research efforts in this domain are focused on detecting high-quality deepfake images and videos, only a few works address the problem of the localization of small segments of audio-visual manipulations embedded in real videos. In this research, we emulate the process of such content generation and propose the AV-Deepfake1M dataset. The dataset contains content-driven (i) video manipulations, (ii) audio manipulations, and (iii) audio-visual manipulations for more than 2K subjects resulting in a total of more than 1M videos. The paper provides a thorough description of the proposed data generation pipeline accompanied by a rigorous analysis of the quality of the generated data. The comprehensive benchmark of the proposed dataset utilizing state-of-the-art deepfake detection and localization methods indicates a significant drop in performance compared to previous datasets. The proposed dataset will play a vital role in building the next-generation deepfake localization methods. ## Dataset ### Download To use this AV-Deepfake1M dataset, you should agree to the [terms and conditions](TERMS_AND_CONDITIONS.md) and the [CC BY-NC 4.0 license](LICENSE). Extract the multi-volume archive with `7z`. ```bash sudo apt install p7zip-rar # Install 7z if you don't have it. 7z x train.zip.001 # Then all the volumes will be extracted. ``` ### Baseline Benchmark | Method | AP@0.5 | AP@0.75 | AP@0.9 | AP@0.95 | AR@50 | AR@20 | AR@10 | AR@5 | |----------------------------|--------|---------|--------|---------|-------|-------|-------|-------| | PyAnnote | 00.03 | 00.00 | 00.00 | 00.00 | 00.67 | 00.67 | 00.67 | 00.67 | | Meso4 | 09.86 | 06.05 | 02.22 | 00.59 | 38.92 | 38.81 | 36.47 | 26.91 | | MesoInception4 | 08.50 | 05.16 | 01.89 | 00.50 | 39.27 | 39.00 | 35.78 | 24.59 | | EfficientViT | 14.71 | 02.42 | 00.13 | 00.01 | 27.04 | 26.43 | 23.90 | 20.31 | | TriDet + VideoMAEv2 | 21.67 | 05.83 | 00.54 | 00.06 | 20.27 | 20.12 | 19.50 | 18.18 | | TriDet + InternVideo | 29.66 | 09.02 | 00.79 | 00.09 | 24.08 | 23.96 | 23.50 | 22.55 | | ActionFormer + VideoMAEv2 | 20.24 | 05.73 | 00.57 | 00.07 | 19.97 | 19.81 | 19.11 | 17.80 | | ActionFormer + InternVideo | 36.08 | 12.01 | 01.23 | 00.16 | 27.11 | 27.00 | 26.60 | 25.80 | | BA-TFD | 37.37 | 06.34 | 00.19 | 00.02 | 45.55 | 35.95 | 30.66 | 26.82 | | BA-TFD+ | 44.42 | 13.64 | 00.48 | 00.03 | 48.86 | 40.37 | 34.67 | 29.88 | | UMMAFormer | 51.64 | 28.07 | 07.65 | 01.58 | 44.07 | 43.45 | 42.09 | 40.27 | ## License This project is under the CC BY-NC 4.0 license. See [LICENSE](LICENSE) for details. ## References If you find this work useful in your research, please cite it. ```bibtex @article{cai2023avdeepfake1m, title = {AV-Deepfake1M: A Large-Scale LLM-Driven Audio-Visual Deepfake Dataset}, author = {Cai, Zhixi and Ghosh, Shreya and Adatia, Aman Pankaj and Hayat, Munawar and Dhall, Abhinav and Stefanov, Kalin}, journal = {arXiv preprint arXiv:2311.15308}, year = {2023}, } ```
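Because access to the archives is gated, they are typically fetched with the Hugging Face Hub client before the `7z` extraction step above. The snippet below is a sketch only: the repository id comes from this record, the file pattern is an assumption based on the `train.zip.001` volume mentioned above, and it presumes the terms have been accepted and a token is configured (e.g. via `huggingface-cli login`).

```python
from huggingface_hub import snapshot_download

# Sketch: download the multi-volume training archives for later extraction with 7z.
snapshot_download(
    repo_id="ControlNet/AV-Deepfake1M",
    repo_type="dataset",
    local_dir="AV-Deepfake1M",
    allow_patterns=["train.zip.*"],  # assumed pattern for the split volumes
)
```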
ControlNet/AV-Deepfake1M
[ "task_categories:video-classification", "size_categories:1M<n<10M", "language:en", "license:cc", "deepfakes", "video", "arxiv:2311.15308", "region:us" ]
2023-12-09T08:06:53+00:00
{"language": ["en"], "license": "cc", "size_categories": ["1M<n<10M"], "task_categories": ["video-classification"], "pretty_name": "AV-Deepfake1M", "tags": ["deepfakes", "video"], "extra_gated_heading": "Access AV-Deepfake1M dataset on Hugging Face", "extra_gated_prompt": "To use this AV-Deepfake1M dataset, you should agree the [terms and conditions](https://github.com/ControlNet/AV-Deepfake1M/blob/master/TERMS_AND_CONDITIONS.md) and the [CC BY-NC 4.0 license](https://github.com/ControlNet/AV-Deepfake1M/blob/master/LICENSE)."}
2023-12-15T00:05:17+00:00
[ "2311.15308" ]
[ "en" ]
TAGS #task_categories-video-classification #size_categories-1M<n<10M #language-English #license-cc #deepfakes #video #arxiv-2311.15308 #region-us
AV-Deepfake1M ============= This is the official repository for the paper AV-Deepfake1M: A Large-Scale LLM-Driven Audio-Visual Deepfake Dataset. Abstract -------- The detection and localization of highly realistic deepfake audio-visual content are challenging even for the most advanced state-of-the-art methods. While most of the research efforts in this domain are focused on detecting high-quality deepfake images and videos, only a few works address the problem of the localization of small segments of audio-visual manipulations embedded in real videos. In this research, we emulate the process of such content generation and propose the AV-Deepfake1M dataset. The dataset contains content-driven (i) video manipulations, (ii) audio manipulations, and (iii) audio-visual manipulations for more than 2K subjects resulting in a total of more than 1M videos. The paper provides a thorough description of the proposed data generation pipeline accompanied by a rigorous analysis of the quality of the generated data. The comprehensive benchmark of the proposed dataset utilizing state-of-the-art deepfake detection and localization methods indicates a significant drop in performance compared to previous datasets. The proposed dataset will play a vital role in building the next-generation deepfake localization methods. Dataset ------- ### Download To use this AV-Deepfake1M dataset, you should agree the terms and conditions and the CC BY-NC 4.0 license. Extract the multi-volume archive with '7z'. ### Baseline Benchmark License ------- This project is under the CC BY-NC 4.0 license. See LICENSE for details. References ---------- If you find this work useful in your research, please cite it.
[ "### Download\n\n\nTo use this AV-Deepfake1M dataset, you should agree the terms and conditions and the CC BY-NC 4.0 license.\n\n\nExtract the multi-volume archive with '7z'.", "### Baseline Benchmark\n\n\n\nLicense\n-------\n\n\nThis project is under the CC BY-NC 4.0 license. See LICENSE for details.\n\n\nReferences\n----------\n\n\nIf you find this work useful in your research, please cite it." ]
[ "TAGS\n#task_categories-video-classification #size_categories-1M<n<10M #language-English #license-cc #deepfakes #video #arxiv-2311.15308 #region-us \n", "### Download\n\n\nTo use this AV-Deepfake1M dataset, you should agree the terms and conditions and the CC BY-NC 4.0 license.\n\n\nExtract the multi-volume archive with '7z'.", "### Baseline Benchmark\n\n\n\nLicense\n-------\n\n\nThis project is under the CC BY-NC 4.0 license. See LICENSE for details.\n\n\nReferences\n----------\n\n\nIf you find this work useful in your research, please cite it." ]
[ 53, 48, 47 ]
[ "passage: TAGS\n#task_categories-video-classification #size_categories-1M<n<10M #language-English #license-cc #deepfakes #video #arxiv-2311.15308 #region-us \n### Download\n\n\nTo use this AV-Deepfake1M dataset, you should agree the terms and conditions and the CC BY-NC 4.0 license.\n\n\nExtract the multi-volume archive with '7z'.### Baseline Benchmark\n\n\n\nLicense\n-------\n\n\nThis project is under the CC BY-NC 4.0 license. See LICENSE for details.\n\n\nReferences\n----------\n\n\nIf you find this work useful in your research, please cite it." ]
fca2755c4d56e3e2a47b88f2c97d3cd8f725a21a
This dataset was built with a slightly modified version of the parsing and chunking method used for [singletongue/wikipedia-utils](https://huggingface.co/datasets/singletongue/wikipedia-utils). Pre-processing was performed using [oshizo/wikipedia-utils](https://github.com/oshizo/wikipedia-utils), which is a fork of the original repository, [singletongue/wikipedia-utils](https://github.com/singletongue/wikipedia-utils). The Wikipedia data was crawled between 2023/12/5 and 2023/12/8.
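As a usage sketch (the field names follow the feature list in this record; streaming keeps memory use low given the roughly 10M rows):

```python
from datasets import load_dataset

# Fields (title, section.h2, text) follow the feature list in this record.
ds = load_dataset("oshizo/japanese-wikipedia-paragraphs", split="train", streaming=True)

for row in ds.take(3):
    heading = row["section"]["h2"] or row["title"]
    print(row["title"], "|", heading, "|", row["text"][:80])
```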
oshizo/japanese-wikipedia-paragraphs
[ "language:ja", "license:cc-by-sa-4.0", "region:us" ]
2023-12-09T11:14:53+00:00
{"language": ["ja"], "license": "cc-by-sa-4.0", "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "pageid", "dtype": "int64"}, {"name": "revid", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "section", "struct": [{"name": "dt", "dtype": "string"}, {"name": "h2", "dtype": "string"}, {"name": "h3", "dtype": "string"}, {"name": "h4", "dtype": "string"}]}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7388520171, "num_examples": 10473325}], "download_size": 3987399592, "dataset_size": 7388520171}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-09T14:09:30+00:00
[]
[ "ja" ]
TAGS #language-Japanese #license-cc-by-sa-4.0 #region-us
A slightly modified version of the parsing and chunking method for singletongue/wikipedia-utils. Pre-processing was performed using oshizo/wikipedia-utils, which is a fork of the original repository, singletongue/wikipedia-utils. The Wikipedia data was crawled between 2023/12/5 and 2023/12/8.
[]
[ "TAGS\n#language-Japanese #license-cc-by-sa-4.0 #region-us \n" ]
[ 23 ]
[ "passage: TAGS\n#language-Japanese #license-cc-by-sa-4.0 #region-us \n" ]
09e9af2ae58cfa82a2ba652c3e56781ed8bfdfd3
# Dataset Card for Evaluation run of fblgit/una-xaberius-34b-v1beta ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/fblgit/una-xaberius-34b-v1beta - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [fblgit/una-xaberius-34b-v1beta](https://huggingface.co/fblgit/una-xaberius-34b-v1beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_fblgit__una-xaberius-34b-v1beta", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T11:16:37.904970](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-xaberius-34b-v1beta/blob/main/results_2023-12-09T11-16-37.904970.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7767755521790687, "acc_stderr": 0.027620532165746333, "acc_norm": 0.7816657153803744, "acc_norm_stderr": 0.02813443103457644, "mc1": 0.4602203182374541, "mc1_stderr": 0.01744801722396088, "mc2": 0.6144919168362304, "mc2_stderr": 0.015159547860602553 }, "harness|arc:challenge|25": { "acc": 0.6791808873720137, "acc_stderr": 0.013640943091946524, "acc_norm": 0.7039249146757679, "acc_norm_stderr": 0.013340916085246254 }, "harness|hellaswag|10": { "acc": 0.6743676558454491, "acc_stderr": 0.004676529200753001, "acc_norm": 0.8676558454491137, "acc_norm_stderr": 0.0033817200071652002 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7481481481481481, "acc_stderr": 0.03749850709174021, "acc_norm": 0.7481481481481481, "acc_norm_stderr": 0.03749850709174021 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8947368421052632, "acc_stderr": 0.024974533450920693, "acc_norm": 0.8947368421052632, "acc_norm_stderr": 0.024974533450920693 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.81, "acc_stderr": 0.03942772444036623, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8113207547169812, "acc_stderr": 0.02407999513006224, "acc_norm": 0.8113207547169812, "acc_norm_stderr": 0.02407999513006224 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8958333333333334, "acc_stderr": 0.025545239210256917, "acc_norm": 0.8958333333333334, "acc_norm_stderr": 0.025545239210256917 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, 
"acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.67, "acc_stderr": 0.047258156262526066, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526066 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7514450867052023, "acc_stderr": 0.03295304696818318, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.03295304696818318 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5882352941176471, "acc_stderr": 0.04897104952726366, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7787234042553192, "acc_stderr": 0.02713634960242406, "acc_norm": 0.7787234042553192, "acc_norm_stderr": 0.02713634960242406 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.631578947368421, "acc_stderr": 0.04537815354939391, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.04537815354939391 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7931034482758621, "acc_stderr": 0.03375672449560554, "acc_norm": 0.7931034482758621, "acc_norm_stderr": 0.03375672449560554 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.7619047619047619, "acc_stderr": 0.02193587808118476, "acc_norm": 0.7619047619047619, "acc_norm_stderr": 0.02193587808118476 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5396825396825397, "acc_stderr": 0.04458029125470973, "acc_norm": 0.5396825396825397, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9161290322580645, "acc_stderr": 0.015769027496775667, "acc_norm": 0.9161290322580645, "acc_norm_stderr": 0.015769027496775667 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6551724137931034, "acc_stderr": 0.03344283744280458, "acc_norm": 0.6551724137931034, "acc_norm_stderr": 0.03344283744280458 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.87, "acc_stderr": 0.03379976689896309, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896309 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8848484848484849, "acc_stderr": 0.024925699798115344, "acc_norm": 0.8848484848484849, "acc_norm_stderr": 0.024925699798115344 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9292929292929293, "acc_stderr": 0.0182631054201995, "acc_norm": 0.9292929292929293, "acc_norm_stderr": 0.0182631054201995 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9792746113989638, "acc_stderr": 0.010281417011909036, "acc_norm": 0.9792746113989638, "acc_norm_stderr": 0.010281417011909036 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.841025641025641, "acc_stderr": 0.01853930114094035, "acc_norm": 0.841025641025641, "acc_norm_stderr": 0.01853930114094035 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4666666666666667, "acc_stderr": 0.030417716961717474, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.030417716961717474 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8865546218487395, "acc_stderr": 
0.02060022575020482, "acc_norm": 0.8865546218487395, "acc_norm_stderr": 0.02060022575020482 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5761589403973509, "acc_stderr": 0.04034846678603397, "acc_norm": 0.5761589403973509, "acc_norm_stderr": 0.04034846678603397 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9321100917431193, "acc_stderr": 0.010785412654517362, "acc_norm": 0.9321100917431193, "acc_norm_stderr": 0.010785412654517362 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6990740740740741, "acc_stderr": 0.03128039084329882, "acc_norm": 0.6990740740740741, "acc_norm_stderr": 0.03128039084329882 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9362745098039216, "acc_stderr": 0.01714392165552496, "acc_norm": 0.9362745098039216, "acc_norm_stderr": 0.01714392165552496 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9156118143459916, "acc_stderr": 0.018094247116473314, "acc_norm": 0.9156118143459916, "acc_norm_stderr": 0.018094247116473314 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7937219730941704, "acc_stderr": 0.02715715047956382, "acc_norm": 0.7937219730941704, "acc_norm_stderr": 0.02715715047956382 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9008264462809917, "acc_stderr": 0.027285246312758957, "acc_norm": 0.9008264462809917, "acc_norm_stderr": 0.027285246312758957 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.9074074074074074, "acc_stderr": 0.028021888038609433, "acc_norm": 0.9074074074074074, "acc_norm_stderr": 0.028021888038609433 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8895705521472392, "acc_stderr": 0.024624937788941318, "acc_norm": 0.8895705521472392, "acc_norm_stderr": 0.024624937788941318 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6517857142857143, "acc_stderr": 0.04521829902833585, "acc_norm": 0.6517857142857143, "acc_norm_stderr": 0.04521829902833585 }, "harness|hendrycksTest-management|5": { "acc": 0.883495145631068, "acc_stderr": 0.03176683948640405, "acc_norm": 0.883495145631068, "acc_norm_stderr": 0.03176683948640405 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9358974358974359, "acc_stderr": 0.016046261631673137, "acc_norm": 0.9358974358974359, "acc_norm_stderr": 0.016046261631673137 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.87, "acc_stderr": 0.03379976689896309, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896309 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9118773946360154, "acc_stderr": 0.01013697820331264, "acc_norm": 0.9118773946360154, "acc_norm_stderr": 0.01013697820331264 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8208092485549133, "acc_stderr": 0.020647590029679332, "acc_norm": 0.8208092485549133, "acc_norm_stderr": 0.020647590029679332 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.794413407821229, "acc_stderr": 0.013516116210724202, "acc_norm": 0.794413407821229, "acc_norm_stderr": 0.013516116210724202 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8594771241830066, "acc_stderr": 0.019899435463539946, "acc_norm": 0.8594771241830066, "acc_norm_stderr": 0.019899435463539946 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8456591639871383, "acc_stderr": 0.020519050342084712, "acc_norm": 0.8456591639871383, "acc_norm_stderr": 0.020519050342084712 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.8796296296296297, "acc_stderr": 0.01810541409432968, "acc_norm": 0.8796296296296297, "acc_norm_stderr": 0.01810541409432968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6560283687943262, "acc_stderr": 0.028338017428611327, "acc_norm": 0.6560283687943262, "acc_norm_stderr": 0.028338017428611327 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6440677966101694, "acc_stderr": 0.012228645537277573, "acc_norm": 0.6440677966101694, "acc_norm_stderr": 0.012228645537277573 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8235294117647058, "acc_stderr": 0.023157468308559345, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.023157468308559345 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8349673202614379, "acc_stderr": 0.015017550799247322, "acc_norm": 0.8349673202614379, "acc_norm_stderr": 0.015017550799247322 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8489795918367347, "acc_stderr": 0.022923004094736858, "acc_norm": 0.8489795918367347, "acc_norm_stderr": 0.022923004094736858 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101706, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.03851597683718533, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.03851597683718533 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8771929824561403, "acc_stderr": 0.02517298435015577, "acc_norm": 0.8771929824561403, "acc_norm_stderr": 0.02517298435015577 }, "harness|truthfulqa:mc|0": { "mc1": 0.4602203182374541, "mc1_stderr": 0.01744801722396088, "mc2": 0.6144919168362304, "mc2_stderr": 0.015159547860602553 }, "harness|winogrande|5": { "acc": 0.8492501973164956, "acc_stderr": 0.010056094631479702 }, "harness|gsm8k|5": { "acc": 0.6338134950720242, "acc_stderr": 0.013270100238748831 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_fblgit__una-xaberius-34b-v1beta
[ "region:us" ]
2023-12-09T11:19:25+00:00
{"pretty_name": "Evaluation run of fblgit/una-xaberius-34b-v1beta", "dataset_summary": "Dataset automatically created during the evaluation run of model [fblgit/una-xaberius-34b-v1beta](https://huggingface.co/fblgit/una-xaberius-34b-v1beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__una-xaberius-34b-v1beta\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T11:16:37.904970](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-xaberius-34b-v1beta/blob/main/results_2023-12-09T11-16-37.904970.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7767755521790687,\n \"acc_stderr\": 0.027620532165746333,\n \"acc_norm\": 0.7816657153803744,\n \"acc_norm_stderr\": 0.02813443103457644,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6144919168362304,\n \"mc2_stderr\": 0.015159547860602553\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.013640943091946524,\n \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.013340916085246254\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6743676558454491,\n \"acc_stderr\": 0.004676529200753001,\n \"acc_norm\": 0.8676558454491137,\n \"acc_norm_stderr\": 0.0033817200071652002\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920693,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920693\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.02407999513006224,\n \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.02407999513006224\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n 
\"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.02713634960242406,\n \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.02713634960242406\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7931034482758621,\n \"acc_stderr\": 0.03375672449560554,\n \"acc_norm\": 0.7931034482758621,\n \"acc_norm_stderr\": 0.03375672449560554\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7619047619047619,\n \"acc_stderr\": 0.02193587808118476,\n \"acc_norm\": 0.7619047619047619,\n \"acc_norm_stderr\": 0.02193587808118476\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9161290322580645,\n \"acc_stderr\": 0.015769027496775667,\n \"acc_norm\": 0.9161290322580645,\n \"acc_norm_stderr\": 0.015769027496775667\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8848484848484849,\n \"acc_stderr\": 0.024925699798115344,\n \"acc_norm\": 0.8848484848484849,\n \"acc_norm_stderr\": 0.024925699798115344\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.0182631054201995,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.0182631054201995\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909036,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909036\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.841025641025641,\n \"acc_stderr\": 0.01853930114094035,\n \"acc_norm\": 0.841025641025641,\n \"acc_norm_stderr\": 0.01853930114094035\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.030417716961717474,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.030417716961717474\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8865546218487395,\n \"acc_stderr\": 0.02060022575020482,\n \"acc_norm\": 0.8865546218487395,\n \"acc_norm_stderr\": 0.02060022575020482\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5761589403973509,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\": 0.5761589403973509,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9321100917431193,\n \"acc_stderr\": 0.010785412654517362,\n \"acc_norm\": 0.9321100917431193,\n \"acc_norm_stderr\": 0.010785412654517362\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6990740740740741,\n \"acc_stderr\": 0.03128039084329882,\n \"acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.03128039084329882\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9362745098039216,\n \"acc_stderr\": 0.01714392165552496,\n \"acc_norm\": 0.9362745098039216,\n \"acc_norm_stderr\": 0.01714392165552496\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473314,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473314\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9074074074074074,\n \"acc_stderr\": 0.028021888038609433,\n \"acc_norm\": 0.9074074074074074,\n \"acc_norm_stderr\": 0.028021888038609433\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.6517857142857143,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640405,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640405\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9118773946360154,\n \"acc_stderr\": 0.01013697820331264,\n \"acc_norm\": 
0.9118773946360154,\n \"acc_norm_stderr\": 0.01013697820331264\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.794413407821229,\n \"acc_stderr\": 0.013516116210724202,\n \"acc_norm\": 0.794413407821229,\n \"acc_norm_stderr\": 0.013516116210724202\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539946,\n \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539946\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8456591639871383,\n \"acc_stderr\": 0.020519050342084712,\n \"acc_norm\": 0.8456591639871383,\n \"acc_norm_stderr\": 0.020519050342084712\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432968,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.028338017428611327,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.028338017428611327\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6440677966101694,\n \"acc_stderr\": 0.012228645537277573,\n \"acc_norm\": 0.6440677966101694,\n \"acc_norm_stderr\": 0.012228645537277573\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8349673202614379,\n \"acc_stderr\": 0.015017550799247322,\n \"acc_norm\": 0.8349673202614379,\n \"acc_norm_stderr\": 0.015017550799247322\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736858,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736858\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6144919168362304,\n \"mc2_stderr\": 0.015159547860602553\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479702\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n \"acc_stderr\": 0.013270100238748831\n }\n}\n```", "repo_url": 
"https://huggingface.co/fblgit/una-xaberius-34b-v1beta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|arc:challenge|25_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|gsm8k|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hellaswag|10_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T11-16-37.904970.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T11-16-37.904970.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T11-16-37.904970.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T11-16-37.904970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T11-16-37.904970.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T11-16-37.904970.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["**/details_harness|winogrande|5_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T11-16-37.904970.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T11_16_37.904970", "path": ["results_2023-12-09T11-16-37.904970.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T11-16-37.904970.parquet"]}]}]}
2023-12-09T11:20:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fblgit/una-xaberius-34b-v1beta ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model fblgit/una-xaberius-34b-v1beta on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T11:16:37.904970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of fblgit/una-xaberius-34b-v1beta", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-xaberius-34b-v1beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T11:16:37.904970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fblgit/una-xaberius-34b-v1beta", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-xaberius-34b-v1beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T11:16:37.904970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fblgit/una-xaberius-34b-v1beta## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-xaberius-34b-v1beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T11:16:37.904970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bc7f61d06caedbb6abbdbdbb960a084e916128f5
# Dataset Card for "colorization" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
calibretaliation/colorization
[ "region:us" ]
2023-12-09T11:29:26+00:00
{"dataset_info": {"features": [{"name": "original_image", "dtype": "image"}, {"name": "edit_prompt", "dtype": "string"}, {"name": "colorized_image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 9367173024.435, "num_examples": 299335}], "download_size": 9409073504, "dataset_size": 9367173024.435}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-09T11:42:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "colorization" More Information needed
[ "# Dataset Card for \"colorization\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"colorization\"\n\nMore Information needed" ]
[ 6, 12 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"colorization\"\n\nMore Information needed" ]
73581a7b1f9c69effcccbfd4bc24820e71a44428
This is the SyntheticTypeIdent dataset from the paper [Visual Data-Type Understanding does not emerge from Scaling Vision-Language Models](https://arxiv.org/abs/2310.08577).
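A minimal loading sketch for reference — the repository id comes from this card, while the split name is an assumption, so check the repository for the actual configuration:

```python
from datasets import load_dataset

# The "train" split name is an assumption; adjust to whatever the repository exposes.
ds = load_dataset("bethgelab/SyntheticTypeIdent", split="train")

print(ds)     # features and row count (1K < n < 10K per the card)
print(ds[0])  # a single image-classification example
```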
bethgelab/SyntheticTypeIdent
[ "task_categories:image-classification", "size_categories:1K<n<10K", "language:en", "license:mit", "arxiv:2310.08577", "region:us" ]
2023-12-09T11:51:30+00:00
{"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["image-classification"], "pretty_name": "SyntheticTypeIdent"}
2024-01-17T20:27:59+00:00
[ "2310.08577" ]
[ "en" ]
TAGS #task_categories-image-classification #size_categories-1K<n<10K #language-English #license-mit #arxiv-2310.08577 #region-us
This is the SyntheticTypeIdent dataset from the paper Visual Data-Type Understanding does not emerge from Scaling Vision-Language Models
[]
[ "TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-mit #arxiv-2310.08577 #region-us \n" ]
[ 47 ]
[ "passage: TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-mit #arxiv-2310.08577 #region-us \n" ]
acc869a5db225a4fd15476f84b2c9fb5fefed9a9
# Dataset Card for "semeval-task-8-b-v2-test-paraphrase" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kpriyanshu256/semeval-task-8-b-v2-test-paraphrase
[ "region:us" ]
2023-12-09T12:40:04+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "paraphrase", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 8206381, "num_examples": 3000}], "download_size": 3712031, "dataset_size": 8206381}}
2023-12-09T12:40:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "semeval-task-8-b-v2-test-paraphrase" More Information needed
[ "# Dataset Card for \"semeval-task-8-b-v2-test-paraphrase\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"semeval-task-8-b-v2-test-paraphrase\"\n\nMore Information needed" ]
[ 6, 27 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"semeval-task-8-b-v2-test-paraphrase\"\n\nMore Information needed" ]
1b113450aa31dcef302f39e37ff65abee3b93778
# Dataset Card for Evaluation run of bongchoi/MoMo-70B-LoRA-V1.1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/bongchoi/MoMo-70B-LoRA-V1.1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [bongchoi/MoMo-70B-LoRA-V1.1](https://huggingface.co/bongchoi/MoMo-70B-LoRA-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_bongchoi__MoMo-70B-LoRA-V1.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T12:57:45.545472](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__MoMo-70B-LoRA-V1.1/blob/main/results_2023-12-09T12-57-45.545472.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6664015038261123, "acc_stderr": 0.03100518950503553, "acc_norm": 0.6709955972660262, "acc_norm_stderr": 0.03161767962532908, "mc1": 0.38310893512851896, "mc1_stderr": 0.01701846167938986, "mc2": 0.5498134743824588, "mc2_stderr": 0.014557620111754889 }, "harness|arc:challenge|25": { "acc": 0.6313993174061433, "acc_stderr": 0.014097810678042194, "acc_norm": 0.6663822525597269, "acc_norm_stderr": 0.013778687054176534 }, "harness|hellaswag|10": { "acc": 0.6716789484166501, "acc_stderr": 0.004686425851253281, "acc_norm": 0.8716391157140012, "acc_norm_stderr": 0.0033380760156172563 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.042561937679014075, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.042561937679014075 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.75, "acc_stderr": 0.03523807393012047, "acc_norm": 0.75, "acc_norm_stderr": 0.03523807393012047 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.73, "acc_stderr": 0.04461960433384741, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.028901593612411784, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.028901593612411784 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8125, "acc_stderr": 0.032639560491693344, "acc_norm": 0.8125, "acc_norm_stderr": 0.032639560491693344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.036146654241808254, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.036146654241808254 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.046550104113196177, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.046550104113196177 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.0407032901370707, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.0407032901370707 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3862433862433862, "acc_stderr": 0.02507598176760168, "acc_norm": 0.3862433862433862, "acc_norm_stderr": 0.02507598176760168 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8096774193548387, "acc_stderr": 0.02233170761182307, "acc_norm": 0.8096774193548387, "acc_norm_stderr": 0.02233170761182307 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8585858585858586, "acc_stderr": 0.02482590979334334, "acc_norm": 0.8585858585858586, "acc_norm_stderr": 0.02482590979334334 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9326424870466321, "acc_stderr": 0.018088393839078915, "acc_norm": 0.9326424870466321, "acc_norm_stderr": 0.018088393839078915 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7025641025641025, "acc_stderr": 0.02317740813146595, "acc_norm": 0.7025641025641025, "acc_norm_stderr": 0.02317740813146595 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524572, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524572 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7310924369747899, "acc_stderr": 0.028801392193631276, "acc_norm": 
0.7310924369747899, "acc_norm_stderr": 0.028801392193631276 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461763, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461763 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5833333333333334, "acc_stderr": 0.033622774366080424, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.033622774366080424 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8725490196078431, "acc_stderr": 0.023405530480846325, "acc_norm": 0.8725490196078431, "acc_norm_stderr": 0.023405530480846325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8776371308016878, "acc_stderr": 0.021331741829746786, "acc_norm": 0.8776371308016878, "acc_norm_stderr": 0.021331741829746786 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7533632286995515, "acc_stderr": 0.02893041312091088, "acc_norm": 0.7533632286995515, "acc_norm_stderr": 0.02893041312091088 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462471, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462471 }, "harness|hendrycksTest-international_law|5": { "acc": 0.859504132231405, "acc_stderr": 0.03172233426002158, "acc_norm": 0.859504132231405, "acc_norm_stderr": 0.03172233426002158 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8365261813537676, "acc_stderr": 0.013223928616741617, "acc_norm": 0.8365261813537676, "acc_norm_stderr": 0.013223928616741617 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.024182427496577605, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.024182427496577605 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43910614525139663, "acc_stderr": 0.016598022120580428, "acc_norm": 0.43910614525139663, "acc_norm_stderr": 0.016598022120580428 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7679738562091504, "acc_stderr": 0.024170840879340873, "acc_norm": 0.7679738562091504, "acc_norm_stderr": 0.024170840879340873 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7556270096463023, "acc_stderr": 0.02440616209466889, "acc_norm": 0.7556270096463023, "acc_norm_stderr": 0.02440616209466889 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7901234567901234, "acc_stderr": 0.02265834408598137, "acc_norm": 0.7901234567901234, "acc_norm_stderr": 0.02265834408598137 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5070921985815603, "acc_stderr": 0.02982449855912901, "acc_norm": 0.5070921985815603, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5019556714471969, "acc_stderr": 0.012770138422208636, "acc_norm": 0.5019556714471969, "acc_norm_stderr": 0.012770138422208636 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7205882352941176, "acc_stderr": 0.027257202606114948, "acc_norm": 0.7205882352941176, "acc_norm_stderr": 0.027257202606114948 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7434640522875817, "acc_stderr": 0.017667841612379005, "acc_norm": 0.7434640522875817, "acc_norm_stderr": 0.017667841612379005 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.04309118709946458, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.04309118709946458 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7673469387755102, "acc_stderr": 0.02704925791589618, "acc_norm": 0.7673469387755102, "acc_norm_stderr": 0.02704925791589618 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8706467661691543, "acc_stderr": 0.023729830881018515, "acc_norm": 0.8706467661691543, "acc_norm_stderr": 0.023729830881018515 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.94, "acc_stderr": 0.023868325657594162, "acc_norm": 0.94, "acc_norm_stderr": 0.023868325657594162 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8538011695906432, "acc_stderr": 0.027097290118070806, "acc_norm": 0.8538011695906432, "acc_norm_stderr": 0.027097290118070806 }, "harness|truthfulqa:mc|0": { "mc1": 0.38310893512851896, "mc1_stderr": 0.01701846167938986, "mc2": 0.5498134743824588, "mc2_stderr": 0.014557620111754889 }, "harness|winogrande|5": { "acc": 0.8334648776637726, "acc_stderr": 0.010470796496781096 }, "harness|gsm8k|5": { "acc": 0.4632297194844579, "acc_stderr": 0.013735191956468648 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_bongchoi__MoMo-70B-LoRA-V1.1
[ "region:us" ]
2023-12-09T13:00:45+00:00
{"pretty_name": "Evaluation run of bongchoi/MoMo-70B-LoRA-V1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [bongchoi/MoMo-70B-LoRA-V1.1](https://huggingface.co/bongchoi/MoMo-70B-LoRA-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bongchoi__MoMo-70B-LoRA-V1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T12:57:45.545472](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__MoMo-70B-LoRA-V1.1/blob/main/results_2023-12-09T12-57-45.545472.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6664015038261123,\n \"acc_stderr\": 0.03100518950503553,\n \"acc_norm\": 0.6709955972660262,\n \"acc_norm_stderr\": 0.03161767962532908,\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.01701846167938986,\n \"mc2\": 0.5498134743824588,\n \"mc2_stderr\": 0.014557620111754889\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176534\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6716789484166501,\n \"acc_stderr\": 0.004686425851253281,\n \"acc_norm\": 0.8716391157140012,\n \"acc_norm_stderr\": 0.0033380760156172563\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.028901593612411784,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.028901593612411784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.02507598176760168,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.02507598176760168\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334334,\n \"acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334334\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078915,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078915\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.02317740813146595,\n 
\"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.02317740813146595\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631276,\n \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846325,\n \"acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n \"acc_stderr\": 0.02893041312091088,\n \"acc_norm\": 0.7533632286995515,\n \"acc_norm_stderr\": 0.02893041312091088\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462471,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462471\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n \"acc_stderr\": 0.013223928616741617,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741617\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n \"acc_stderr\": 0.02440616209466889,\n \"acc_norm\": 0.7556270096463023,\n \"acc_norm_stderr\": 0.02440616209466889\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7901234567901234,\n \"acc_stderr\": 0.02265834408598137,\n \"acc_norm\": 0.7901234567901234,\n \"acc_norm_stderr\": 0.02265834408598137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5019556714471969,\n \"acc_stderr\": 0.012770138422208636,\n \"acc_norm\": 0.5019556714471969,\n \"acc_norm_stderr\": 0.012770138422208636\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114948,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114948\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7434640522875817,\n \"acc_stderr\": 0.017667841612379005,\n \"acc_norm\": 0.7434640522875817,\n \"acc_norm_stderr\": 0.017667841612379005\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594162,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594162\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.01701846167938986,\n \"mc2\": 0.5498134743824588,\n \"mc2_stderr\": 0.014557620111754889\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781096\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4632297194844579,\n \"acc_stderr\": 0.013735191956468648\n }\n}\n```", "repo_url": "https://huggingface.co/bongchoi/MoMo-70B-LoRA-V1.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|arc:challenge|25_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|gsm8k|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hellaswag|10_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T12-57-45.545472.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T12-57-45.545472.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T12-57-45.545472.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T12-57-45.545472.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T12-57-45.545472.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T12-57-45.545472.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["**/details_harness|winogrande|5_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T12-57-45.545472.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T12_57_45.545472", "path": ["results_2023-12-09T12-57-45.545472.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T12-57-45.545472.parquet"]}]}]}
2023-12-09T13:01:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bongchoi/MoMo-70B-LoRA-V1.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model bongchoi/MoMo-70B-LoRA-V1.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T12:57:45.545472 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of bongchoi/MoMo-70B-LoRA-V1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bongchoi/MoMo-70B-LoRA-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T12:57:45.545472(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bongchoi/MoMo-70B-LoRA-V1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bongchoi/MoMo-70B-LoRA-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T12:57:45.545472(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bongchoi/MoMo-70B-LoRA-V1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bongchoi/MoMo-70B-LoRA-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T12:57:45.545472(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
be3bed269179b338b8a7ba3f386609d30e58da6b
# Touch Rugby Rules Dataset (for embeddings) train.csv is taken from the [International Touch Website](https://cdn.internationaltouch.org/public/FIT%205th%20Edition%20Rulebook.pdf). test.csv is copy-pasted from abbreviated rules on the [UK Touch website](https://www.englandtouch.org.uk/develop/coaching/the-rules/). Note that I'm bypassing the PDF-to-text stage. All text is chunked to a length of 100 tokens with 50% overlap. For educational and non-commercial use only.
acazau/touch-rugby-rules-embeddings
[ "task_categories:text-generation", "size_categories:n<1K", "language:en", "fine-tuning", "touch rugby", "region:us" ]
2023-12-09T13:13:02+00:00
{"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["text-generation"], "tags": ["fine-tuning", "touch rugby"]}
2023-12-09T13:15:24+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #size_categories-n<1K #language-English #fine-tuning #touch rugby #region-us
# Touch Rugby Rules Dataset (for embeddings) URL is taken from the International Touch Website URL is copy pasted from abbreviated rules on the UK Touch website. Note that I'm bypassing the pdf to text stage. All text is chunked to a length of 100 tokens with 50% overlap. For educational and non-commercial use only.
[ "# Touch Rugby Rules Dataset (for embeddings)\n\nURL is taken from the International Touch Website\n\nURL is copy pasted from abbreviated rules on the UK Touch website. Note that I'm bypassing the pdf to text stage.\n\nAll text is chunked to a length of 100 tokens with 50% overlap.\n\nFor educational and non-commercial use only." ]
[ "TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #fine-tuning #touch rugby #region-us \n", "# Touch Rugby Rules Dataset (for embeddings)\n\nURL is taken from the International Touch Website\n\nURL is copy pasted from abbreviated rules on the UK Touch website. Note that I'm bypassing the pdf to text stage.\n\nAll text is chunked to a length of 100 tokens with 50% overlap.\n\nFor educational and non-commercial use only." ]
[ 39, 81 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #fine-tuning #touch rugby #region-us \n# Touch Rugby Rules Dataset (for embeddings)\n\nURL is taken from the International Touch Website\n\nURL is copy pasted from abbreviated rules on the UK Touch website. Note that I'm bypassing the pdf to text stage.\n\nAll text is chunked to a length of 100 tokens with 50% overlap.\n\nFor educational and non-commercial use only." ]
e9b9a8405759290874fe8936d96ce0e256939da5
# Dataset Card for "librispeech_asr-audiodec_dac_16k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
anthony-wss/librispeech_asr-audiodec_dac_16k
[ "region:us" ]
2023-12-09T13:13:11+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train.clean.360", "path": "data/train.clean.360-*"}, {"split": "train.other.500", "path": "data/train.other.500-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "train.clean.360", "num_bytes": 12984026772, "num_examples": 104014}, {"name": "train.other.500", "num_bytes": 17752431057, "num_examples": 148688}], "download_size": 4787888994, "dataset_size": 30736457829}}
2023-12-09T13:23:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for "librispeech_asr-audiodec_dac_16k" More Information needed
[ "# Dataset Card for \"librispeech_asr-audiodec_dac_16k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"librispeech_asr-audiodec_dac_16k\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"librispeech_asr-audiodec_dac_16k\"\n\nMore Information needed" ]
98f41256882393039a10976e02152a0d4327465a
# Dataset Card for Evaluation run of jondurbin/spicyboros-70b-2.2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/jondurbin/spicyboros-70b-2.2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [jondurbin/spicyboros-70b-2.2](https://huggingface.co/jondurbin/spicyboros-70b-2.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jondurbin__spicyboros-70b-2.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T13:35:58.790771](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__spicyboros-70b-2.2/blob/main/results_2023-12-09T13-35-58.790771.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6994977035152228, "acc_stderr": 0.03000980046510645, "acc_norm": 0.7061676972820593, "acc_norm_stderr": 0.03058946235753942, "mc1": 0.40758873929008566, "mc1_stderr": 0.017201949234553107, "mc2": 0.5830864962460272, "mc2_stderr": 0.015064621118044078 }, "harness|arc:challenge|25": { "acc": 0.6569965870307167, "acc_stderr": 0.013872423223718166, "acc_norm": 0.7073378839590444, "acc_norm_stderr": 0.013295916103619422 }, "harness|hellaswag|10": { "acc": 0.6840270862378013, "acc_stderr": 0.004639520453444027, "acc_norm": 0.8758215494921331, "acc_norm_stderr": 0.0032911103784119644 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8355263157894737, "acc_stderr": 0.030167533468632723, "acc_norm": 0.8355263157894737, "acc_norm_stderr": 0.030167533468632723 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8472222222222222, "acc_stderr": 0.030085743248565666, "acc_norm": 0.8472222222222222, "acc_norm_stderr": 0.030085743248565666 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, 
"acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.035331333893236574, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.035331333893236574 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105654, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7063829787234043, "acc_stderr": 0.029771642712491227, "acc_norm": 0.7063829787234043, "acc_norm_stderr": 0.029771642712491227 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6344827586206897, "acc_stderr": 0.040131241954243856, "acc_norm": 0.6344827586206897, "acc_norm_stderr": 0.040131241954243856 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.02555992055053101, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.02555992055053101 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8193548387096774, "acc_stderr": 0.021886178567172527, "acc_norm": 0.8193548387096774, "acc_norm_stderr": 0.021886178567172527 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.035179450386910616, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8303030303030303, "acc_stderr": 0.029311188674983127, "acc_norm": 0.8303030303030303, "acc_norm_stderr": 0.029311188674983127 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8838383838383839, "acc_stderr": 0.022828881775249377, "acc_norm": 0.8838383838383839, "acc_norm_stderr": 0.022828881775249377 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9378238341968912, "acc_stderr": 0.017426974154240528, "acc_norm": 0.9378238341968912, "acc_norm_stderr": 0.017426974154240528 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7282051282051282, "acc_stderr": 0.022556551010132354, "acc_norm": 0.7282051282051282, "acc_norm_stderr": 0.022556551010132354 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251976, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251976 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.7773109243697479, "acc_stderr": 0.02702543349888238, "acc_norm": 0.7773109243697479, "acc_norm_stderr": 0.02702543349888238 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.46357615894039733, "acc_stderr": 0.04071636065944215, "acc_norm": 0.46357615894039733, "acc_norm_stderr": 0.04071636065944215 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8990825688073395, "acc_stderr": 0.01291467354536444, "acc_norm": 0.8990825688073395, "acc_norm_stderr": 0.01291467354536444 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5833333333333334, "acc_stderr": 0.033622774366080424, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.033622774366080424 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9215686274509803, "acc_stderr": 0.018869514646658928, "acc_norm": 0.9215686274509803, "acc_norm_stderr": 0.018869514646658928 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.890295358649789, "acc_stderr": 0.020343400734868834, "acc_norm": 0.890295358649789, "acc_norm_stderr": 0.020343400734868834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8071748878923767, "acc_stderr": 0.026478240960489365, "acc_norm": 0.8071748878923767, "acc_norm_stderr": 0.026478240960489365 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8778625954198473, "acc_stderr": 0.028718776889342327, "acc_norm": 0.8778625954198473, "acc_norm_stderr": 0.028718776889342327 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.03092278832044579, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.03092278832044579 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.03602814176392645, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.03602814176392645 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8098159509202454, "acc_stderr": 0.030833491146281235, "acc_norm": 0.8098159509202454, "acc_norm_stderr": 0.030833491146281235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.0376017800602662, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.0376017800602662 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.019875655027867443, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.019875655027867443 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8722860791826309, "acc_stderr": 0.011935626313999876, "acc_norm": 0.8722860791826309, "acc_norm_stderr": 0.011935626313999876 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7803468208092486, "acc_stderr": 0.02228963885261789, "acc_norm": 0.7803468208092486, "acc_norm_stderr": 0.02228963885261789 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5620111731843576, "acc_stderr": 0.01659339422756484, "acc_norm": 0.5620111731843576, "acc_norm_stderr": 0.01659339422756484 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7810457516339869, "acc_stderr": 0.02367908986180772, "acc_norm": 0.7810457516339869, "acc_norm_stderr": 0.02367908986180772 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.77491961414791, "acc_stderr": 0.023720088516179027, "acc_norm": 0.77491961414791, "acc_norm_stderr": 
0.023720088516179027 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8518518518518519, "acc_stderr": 0.01976645956359726, "acc_norm": 0.8518518518518519, "acc_norm_stderr": 0.01976645956359726 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5815602836879432, "acc_stderr": 0.029427994039420004, "acc_norm": 0.5815602836879432, "acc_norm_stderr": 0.029427994039420004 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5521512385919165, "acc_stderr": 0.012700582404768235, "acc_norm": 0.5521512385919165, "acc_norm_stderr": 0.012700582404768235 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7426470588235294, "acc_stderr": 0.026556519470041503, "acc_norm": 0.7426470588235294, "acc_norm_stderr": 0.026556519470041503 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.761437908496732, "acc_stderr": 0.01724238582877962, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.01724238582877962 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8163265306122449, "acc_stderr": 0.024789071332007636, "acc_norm": 0.8163265306122449, "acc_norm_stderr": 0.024789071332007636 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101716, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101716 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.0272659924344291, "acc_norm": 0.92, "acc_norm_stderr": 0.0272659924344291 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8830409356725146, "acc_stderr": 0.024648068961366176, "acc_norm": 0.8830409356725146, "acc_norm_stderr": 0.024648068961366176 }, "harness|truthfulqa:mc|0": { "mc1": 0.40758873929008566, "mc1_stderr": 0.017201949234553107, "mc2": 0.5830864962460272, "mc2_stderr": 0.015064621118044078 }, "harness|winogrande|5": { "acc": 0.8382004735595896, "acc_stderr": 0.010350128010292402 }, "harness|gsm8k|5": { "acc": 0.4094010614101592, "acc_stderr": 0.013544504071244504 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_jondurbin__spicyboros-70b-2.2
[ "region:us" ]
2023-12-09T13:38:58+00:00
{"pretty_name": "Evaluation run of jondurbin/spicyboros-70b-2.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/spicyboros-70b-2.2](https://huggingface.co/jondurbin/spicyboros-70b-2.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__spicyboros-70b-2.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T13:35:58.790771](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__spicyboros-70b-2.2/blob/main/results_2023-12-09T13-35-58.790771.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6994977035152228,\n \"acc_stderr\": 0.03000980046510645,\n \"acc_norm\": 0.7061676972820593,\n \"acc_norm_stderr\": 0.03058946235753942,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5830864962460272,\n \"mc2_stderr\": 0.015064621118044078\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6569965870307167,\n \"acc_stderr\": 0.013872423223718166,\n \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619422\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6840270862378013,\n \"acc_stderr\": 0.004639520453444027,\n \"acc_norm\": 0.8758215494921331,\n \"acc_norm_stderr\": 0.0032911103784119644\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632723,\n \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632723\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172527,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172527\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983127,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983127\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.7282051282051282,\n \"acc_stderr\": 0.022556551010132354,\n \"acc_norm\": 0.7282051282051282,\n \"acc_norm_stderr\": 0.022556551010132354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.02702543349888238,\n \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.02702543349888238\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8990825688073395,\n \"acc_stderr\": 0.01291467354536444,\n \"acc_norm\": 0.8990825688073395,\n \"acc_norm_stderr\": 0.01291467354536444\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342327,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342327\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281235,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867443,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867443\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n \"acc_stderr\": 0.011935626313999876,\n 
\"acc_norm\": 0.8722860791826309,\n \"acc_norm_stderr\": 0.011935626313999876\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.02228963885261789,\n \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.02228963885261789\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5620111731843576,\n \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.5620111731843576,\n \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.01976645956359726,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.01976645956359726\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5815602836879432,\n \"acc_stderr\": 0.029427994039420004,\n \"acc_norm\": 0.5815602836879432,\n \"acc_norm_stderr\": 0.029427994039420004\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5521512385919165,\n \"acc_stderr\": 0.012700582404768235,\n \"acc_norm\": 0.5521512385919165,\n \"acc_norm_stderr\": 0.012700582404768235\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041503,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041503\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.01724238582877962,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.01724238582877962\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366176,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366176\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5830864962460272,\n \"mc2_stderr\": 0.015064621118044078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292402\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4094010614101592,\n \"acc_stderr\": 0.013544504071244504\n }\n}\n```", "repo_url": 
"https://huggingface.co/jondurbin/spicyboros-70b-2.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|arc:challenge|25_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|gsm8k|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hellaswag|10_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T13-35-58.790771.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T13-35-58.790771.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T13-35-58.790771.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T13-35-58.790771.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T13-35-58.790771.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T13-35-58.790771.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["**/details_harness|winogrande|5_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T13-35-58.790771.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T13_35_58.790771", "path": ["results_2023-12-09T13-35-58.790771.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T13-35-58.790771.parquet"]}]}]}
2023-12-09T13:39:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jondurbin/spicyboros-70b-2.2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jondurbin/spicyboros-70b-2.2 on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T13:35:58.790771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of jondurbin/spicyboros-70b-2.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/spicyboros-70b-2.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T13:35:58.790771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jondurbin/spicyboros-70b-2.2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/spicyboros-70b-2.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T13:35:58.790771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/spicyboros-70b-2.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/spicyboros-70b-2.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T13:35:58.790771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b61192d1a5790843425862111a0eb870d911e8a9
# Enhanced GRID Corpus with Lip Landmark Coordinates ## Introduction This enhanced version of the GRID audiovisual sentence corpus, originally available at [Zenodo](https://zenodo.org/records/3625687), incorporates significant new features for auditory-visual speech recognition research. Building upon the preprocessed data from [LipNet-PyTorch](https://github.com/VIPL-Audio-Visual-Speech-Understanding/LipNet-PyTorch), we have added lip landmark coordinates to the dataset, providing detailed positional information of key points around the lips. This addition greatly enhances its utility in visual speech recognition and related fields. Furthermore, to facilitate ease of access and integration into existing machine learning workflows, we have published this enriched dataset on the Hugging Face platform, making it readily available to the research community. ## Dataset Structure This dataset is split into 3 directories: - `lip_images`: contains the images of the lips - `speaker_id`: contains the videos of a particular speaker - `video_id`: contains the video frames of a particular video - `frame_no.jpg`: jpg image of the lips of a particular frame - `lip_coordinates`: contains the landmark coordinates of the lips - `speaker_id`: contains the lip landmarks of a particular speaker - `video_id.json`: a json file containing the lip landmark coordinates of a particular video, where the keys are the frame numbers and the values are the x, y lip landmark coordinates - `GRID_alignments`: contains the alignments of all the videos in the dataset - `speaker_id`: contains the alignments of a particular speaker - `video_id.align`: contains the alignments of a particular video, where each line gives a word together with its start and end time in the video ## Details The lip landmark coordinates are extracted from the original videos in the GRID corpus using the dlib library with the [shape_predictor_68_face_landmarks_GTX.dat](https://github.com/davisking/dlib-models) pretrained model. The coordinates are then saved in a json file, where the keys are the frame numbers and the values are the x, y lip landmark coordinates. The lip landmark coordinates are saved in the same order as the frames in the video. ## Usage The dataset can be downloaded by cloning this repository. ### Cloning the repository ```bash git clone https://huggingface.co/datasets/SilentSpeak/EGCLLC ``` ### Loading the dataset After cloning the repository, you can load the dataset by unpacking the tar files with the dataset_tar.py script. Alternatively, it is usually faster to un-tar the tar files directly using the following commands: ```bash tar -xvf lip_images.tar tar -xvf lip_coordinates.tar tar -xvf GRID_alignments.tar ``` ## Acknowledgements Alvarez Casado, C., Bordallo Lopez, M. Real-time face alignment: evaluation methods, training strategies and implementation optimization. Springer Journal of Real-time image processing, 2021 Assael, Y., Shillingford, B., Whiteson, S., & Freitas, N. (2017). LipNet: End-to-End Sentence-level Lipreading. GPU Technology Conference. Cooke, M., Barker, J., Cunningham, S., & Shao, X. (2006). The Grid Audio-Visual Speech Corpus (1.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.3625687
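For readers who want a concrete starting point, the sketch below shows one way to read the extracted files following the directory layout and file formats described above. It is illustrative only: the speaker and video IDs are hypothetical placeholders, the JSON values are assumed to be lists of [x, y] pairs, and the `start end word` column order assumed for the `.align` lines is inferred from the description above rather than guaranteed.

```python
import json
from pathlib import Path

# Hypothetical IDs -- substitute real directory/file names from your local
# copy of the un-tarred archives.
speaker_id = "s1"
video_id = "bbaf2n"
root = Path(".")  # folder containing lip_images/, lip_coordinates/, GRID_alignments/

# Lip landmark coordinates: one JSON file per video, keyed by frame number.
coords_path = root / "lip_coordinates" / speaker_id / f"{video_id}.json"
with open(coords_path) as f:
    lip_coordinates = json.load(f)

# Frames are stored in the same order as in the video; sorting the keys
# numerically guards against JSON objects that do not preserve order.
for frame_no in sorted(lip_coordinates, key=int):
    landmarks = lip_coordinates[frame_no]  # assumed: list of [x, y] pairs
    print(frame_no, landmarks[:2], "...")

# Word alignments: one .align file per video, one word per line with its
# start and end time (column order assumed to be "start end word").
align_path = root / "GRID_alignments" / speaker_id / f"{video_id}.align"
alignments = []
with open(align_path) as f:
    for line in f:
        start, end, word = line.split()
        alignments.append((float(start), float(end), word))

print(alignments)
```

Lip images for the same video can then be matched to these frame numbers via `lip_images/<speaker_id>/<video_id>/<frame_no>.jpg`.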
SilentSpeak/EGCLLC
[ "size_categories:10K<n<100K", "language:en", "license:cc-by-4.0", "region:us" ]
2023-12-09T13:55:52+00:00
{"language": ["en"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"]}
2023-12-11T06:48:54+00:00
[]
[ "en" ]
TAGS #size_categories-10K<n<100K #language-English #license-cc-by-4.0 #region-us
# Enhanced GRID Corpus with Lip Landmark Coordinates ## Introduction This enhanced version of the GRID audiovisual sentence corpus, originally available at Zenodo, incorporates significant new features for auditory-visual speech recognition research. Building upon the preprocessed data from LipNet-PyTorch, we have added lip landmark coordinates to the dataset, providing detailed positional information of key points around the lips. This addition greatly enhances its utility in visual speech recognition and related fields. Furthermore, to facilitate ease of access and integration into existing machine learning workflows, we have published this enriched dataset on the Hugging Face platform, making it readily available to the research community. ## Dataset Structure This dataset is split into 3 directories: - 'lip_images': contains the images of the lips - 'speaker_id': contains the videos of a particular speaker - 'video_id': contains the video frames of a particular video - 'frame_no.jpg': jpg image of the lips of a particular frame - 'lip_coordinates': contains the landmark coordinates of the lips - 'speaker_id': contains the lip landmark of a particular speaker - 'video_id.json': a json file containing the lip landmark coordinates of a particular video, where the keys are the frame numbers and the values are the x, y lip landmark coordinates - 'GRID_alignments': contains the alignments of all the videos in the dataset - 'speaker_id': contains the alignments of a particular speaker - 'video_id.align': contains the alignments of a particular video, where each line is a word and the start and end time of the word in the video ## Details The lip landmark coordinates are extracted using the original videos in the GRID corpus and using the dlib library, using the shape_predictor_68_face_landmarks_GTX.dat pretrained model. The lip landmark coordinates are then saved in a json file, where the keys are the frame numbers and the values are the x, y lip landmark coordinates. The lip landmark coordinates are saved in the same order as the frames in the video. ## Usage The dataset can be downloaded by cloning this repository. ### Cloning the repository ### Loading the dataset After cloning the repository, you can load the dataset by unpacking the tar file and using dataset_tar.py script. Alternatively, a probably faster method is that, you can un-tar the tar files using the following command: ## Acknowledgements Alvarez Casado, C., Bordallo Lopez, M. Real-time face alignment: evaluation methods, training strategies and implementation optimization. Springer Journal of Real-time image processing, 2021 Assael, Y., Shillingford, B., Whiteson, S., & Freitas, N. (2017). LipNet: End-to-End Sentence-level Lipreading. GPU Technology Conference. Cooke, M., Barker, J., Cunningham, S., & Shao, X. (2006). The Grid Audio-Visual Speech Corpus (1.0) [Data set]. Zenodo. URL
[ "# Enhanced GRID Corpus with Lip Landmark Coordinates", "## Introduction\n\nThis enhanced version of the GRID audiovisual sentence corpus, originally available at Zenodo, incorporates significant new features for auditory-visual speech recognition research. Building upon the preprocessed data from LipNet-PyTorch, we have added lip landmark coordinates to the dataset, providing detailed positional information of key points around the lips. This addition greatly enhances its utility in visual speech recognition and related fields. Furthermore, to facilitate ease of access and integration into existing machine learning workflows, we have published this enriched dataset on the Hugging Face platform, making it readily available to the research community.", "## Dataset Structure\n\nThis dataset is split into 3 directories:\n\n- 'lip_images': contains the images of the lips\n - 'speaker_id': contains the videos of a particular speaker\n - 'video_id': contains the video frames of a particular video\n - 'frame_no.jpg': jpg image of the lips of a particular frame\n- 'lip_coordinates': contains the landmark coordinates of the lips\n - 'speaker_id': contains the lip landmark of a particular speaker\n - 'video_id.json': a json file containing the lip landmark coordinates of a particular video, where the keys are the frame numbers and the values are the x, y lip landmark coordinates\n- 'GRID_alignments': contains the alignments of all the videos in the dataset\n - 'speaker_id': contains the alignments of a particular speaker\n - 'video_id.align': contains the alignments of a particular video, where each line is a word and the start and end time of the word in the video", "## Details\n\nThe lip landmark coordinates are extracted using the original videos in the GRID corpus and using the dlib library, using the shape_predictor_68_face_landmarks_GTX.dat pretrained model. The lip landmark coordinates are then saved in a json file, where the keys are the frame numbers and the values are the x, y lip landmark coordinates. The lip landmark coordinates are saved in the same order as the frames in the video.", "## Usage\n\nThe dataset can be downloaded by cloning this repository.", "### Cloning the repository", "### Loading the dataset\n\nAfter cloning the repository, you can load the dataset by unpacking the tar file and using dataset_tar.py script.\n\nAlternatively, a probably faster method is that, you can un-tar the tar files using the following command:", "## Acknowledgements\n\nAlvarez Casado, C., Bordallo Lopez, M.\nReal-time face alignment: evaluation methods, training strategies and implementation optimization.\nSpringer Journal of Real-time image processing, 2021\n\nAssael, Y., Shillingford, B., Whiteson, S., & Freitas, N. (2017). LipNet: End-to-End Sentence-level Lipreading. GPU Technology Conference.\n\nCooke, M., Barker, J., Cunningham, S., & Shao, X. (2006). The Grid Audio-Visual Speech Corpus (1.0) [Data set]. Zenodo. URL" ]
[ "TAGS\n#size_categories-10K<n<100K #language-English #license-cc-by-4.0 #region-us \n", "# Enhanced GRID Corpus with Lip Landmark Coordinates", "## Introduction\n\nThis enhanced version of the GRID audiovisual sentence corpus, originally available at Zenodo, incorporates significant new features for auditory-visual speech recognition research. Building upon the preprocessed data from LipNet-PyTorch, we have added lip landmark coordinates to the dataset, providing detailed positional information of key points around the lips. This addition greatly enhances its utility in visual speech recognition and related fields. Furthermore, to facilitate ease of access and integration into existing machine learning workflows, we have published this enriched dataset on the Hugging Face platform, making it readily available to the research community.", "## Dataset Structure\n\nThis dataset is split into 3 directories:\n\n- 'lip_images': contains the images of the lips\n - 'speaker_id': contains the videos of a particular speaker\n - 'video_id': contains the video frames of a particular video\n - 'frame_no.jpg': jpg image of the lips of a particular frame\n- 'lip_coordinates': contains the landmark coordinates of the lips\n - 'speaker_id': contains the lip landmark of a particular speaker\n - 'video_id.json': a json file containing the lip landmark coordinates of a particular video, where the keys are the frame numbers and the values are the x, y lip landmark coordinates\n- 'GRID_alignments': contains the alignments of all the videos in the dataset\n - 'speaker_id': contains the alignments of a particular speaker\n - 'video_id.align': contains the alignments of a particular video, where each line is a word and the start and end time of the word in the video", "## Details\n\nThe lip landmark coordinates are extracted using the original videos in the GRID corpus and using the dlib library, using the shape_predictor_68_face_landmarks_GTX.dat pretrained model. The lip landmark coordinates are then saved in a json file, where the keys are the frame numbers and the values are the x, y lip landmark coordinates. The lip landmark coordinates are saved in the same order as the frames in the video.", "## Usage\n\nThe dataset can be downloaded by cloning this repository.", "### Cloning the repository", "### Loading the dataset\n\nAfter cloning the repository, you can load the dataset by unpacking the tar file and using dataset_tar.py script.\n\nAlternatively, a probably faster method is that, you can un-tar the tar files using the following command:", "## Acknowledgements\n\nAlvarez Casado, C., Bordallo Lopez, M.\nReal-time face alignment: evaluation methods, training strategies and implementation optimization.\nSpringer Journal of Real-time image processing, 2021\n\nAssael, Y., Shillingford, B., Whiteson, S., & Freitas, N. (2017). LipNet: End-to-End Sentence-level Lipreading. GPU Technology Conference.\n\nCooke, M., Barker, J., Cunningham, S., & Shao, X. (2006). The Grid Audio-Visual Speech Corpus (1.0) [Data set]. Zenodo. URL" ]
[ 31, 14, 143, 250, 111, 18, 8, 61, 147 ]
[ "passage: TAGS\n#size_categories-10K<n<100K #language-English #license-cc-by-4.0 #region-us \n# Enhanced GRID Corpus with Lip Landmark Coordinates## Introduction\n\nThis enhanced version of the GRID audiovisual sentence corpus, originally available at Zenodo, incorporates significant new features for auditory-visual speech recognition research. Building upon the preprocessed data from LipNet-PyTorch, we have added lip landmark coordinates to the dataset, providing detailed positional information of key points around the lips. This addition greatly enhances its utility in visual speech recognition and related fields. Furthermore, to facilitate ease of access and integration into existing machine learning workflows, we have published this enriched dataset on the Hugging Face platform, making it readily available to the research community.## Dataset Structure\n\nThis dataset is split into 3 directories:\n\n- 'lip_images': contains the images of the lips\n - 'speaker_id': contains the videos of a particular speaker\n - 'video_id': contains the video frames of a particular video\n - 'frame_no.jpg': jpg image of the lips of a particular frame\n- 'lip_coordinates': contains the landmark coordinates of the lips\n - 'speaker_id': contains the lip landmark of a particular speaker\n - 'video_id.json': a json file containing the lip landmark coordinates of a particular video, where the keys are the frame numbers and the values are the x, y lip landmark coordinates\n- 'GRID_alignments': contains the alignments of all the videos in the dataset\n - 'speaker_id': contains the alignments of a particular speaker\n - 'video_id.align': contains the alignments of a particular video, where each line is a word and the start and end time of the word in the video" ]
fea7e4ecb072c5620125fa532be31cd96f440e6e
# Dataset Card for Evaluation run of Qwen/Qwen-72B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Qwen/Qwen-72B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Qwen/Qwen-72B](https://huggingface.co/Qwen/Qwen-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 62 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen-72B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-05T02:10:37.267059](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen-72B/blob/main/results_2023-12-05T02-10-37.267059.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7692238060042156, "acc_stderr": 0.027807291244956196, "acc_norm": 0.7731238892784332, "acc_norm_stderr": 0.028330728981592973, "mc1": 0.42717258261933905, "mc1_stderr": 0.017316834410963933, "mc2": 0.6019109516805667, "mc2_stderr": 0.014606562783785249 }, "harness|arc:challenge|25": { "acc": 0.6220136518771331, "acc_stderr": 0.0141696645203031, "acc_norm": 0.6518771331058021, "acc_norm_stderr": 0.01392100859517935 }, "harness|hellaswag|10": { "acc": 0.6684923322047401, "acc_stderr": 0.004697929774670292, "acc_norm": 0.8593905596494722, "acc_norm_stderr": 0.0034690778470563865 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7333333333333333, "acc_stderr": 0.038201699145179055, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.038201699145179055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8552631578947368, "acc_stderr": 0.028631951845930394, "acc_norm": 0.8552631578947368, "acc_norm_stderr": 0.028631951845930394 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.8, "acc_stderr": 0.04020151261036844, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036844 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8377358490566038, "acc_stderr": 0.022691482872035342, "acc_norm": 0.8377358490566038, "acc_norm_stderr": 0.022691482872035342 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9375, "acc_stderr": 0.02024219611347799, "acc_norm": 0.9375, "acc_norm_stderr": 0.02024219611347799 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.65, "acc_stderr": 
0.047937248544110175, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110175 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7803468208092486, "acc_stderr": 0.031568093627031744, "acc_norm": 0.7803468208092486, "acc_norm_stderr": 0.031568093627031744 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5, "acc_stderr": 0.04975185951049946, "acc_norm": 0.5, "acc_norm_stderr": 0.04975185951049946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7872340425531915, "acc_stderr": 0.026754391348039766, "acc_norm": 0.7872340425531915, "acc_norm_stderr": 0.026754391348039766 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5964912280701754, "acc_stderr": 0.04615186962583707, "acc_norm": 0.5964912280701754, "acc_norm_stderr": 0.04615186962583707 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7862068965517242, "acc_stderr": 0.03416520447747549, "acc_norm": 0.7862068965517242, "acc_norm_stderr": 0.03416520447747549 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6878306878306878, "acc_stderr": 0.02386520683697258, "acc_norm": 0.6878306878306878, "acc_norm_stderr": 0.02386520683697258 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5634920634920635, "acc_stderr": 0.04435932892851466, "acc_norm": 0.5634920634920635, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8935483870967742, "acc_stderr": 0.017545102951656632, "acc_norm": 0.8935483870967742, "acc_norm_stderr": 0.017545102951656632 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6551724137931034, "acc_stderr": 0.03344283744280459, "acc_norm": 0.6551724137931034, "acc_norm_stderr": 0.03344283744280459 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8545454545454545, "acc_stderr": 0.027530196355066573, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066573 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9343434343434344, "acc_stderr": 0.017646526677233317, "acc_norm": 0.9343434343434344, "acc_norm_stderr": 0.017646526677233317 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9896373056994818, "acc_stderr": 0.007308424386792194, "acc_norm": 0.9896373056994818, "acc_norm_stderr": 0.007308424386792194 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8102564102564103, "acc_stderr": 0.019880165406588768, "acc_norm": 0.8102564102564103, "acc_norm_stderr": 0.019880165406588768 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4962962962962963, "acc_stderr": 0.03048470166508437, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.03048470166508437 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8445378151260504, "acc_stderr": 0.023536818625398904, "acc_norm": 0.8445378151260504, "acc_norm_stderr": 0.023536818625398904 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.5695364238410596, "acc_stderr": 0.04042809961395634, "acc_norm": 0.5695364238410596, "acc_norm_stderr": 0.04042809961395634 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9284403669724771, "acc_stderr": 0.011051255247815476, "acc_norm": 0.9284403669724771, "acc_norm_stderr": 0.011051255247815476 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6990740740740741, "acc_stderr": 0.03128039084329883, "acc_norm": 0.6990740740740741, "acc_norm_stderr": 0.03128039084329883 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9362745098039216, "acc_stderr": 0.01714392165552496, "acc_norm": 0.9362745098039216, "acc_norm_stderr": 0.01714392165552496 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8987341772151899, "acc_stderr": 0.019637720526065505, "acc_norm": 0.8987341772151899, "acc_norm_stderr": 0.019637720526065505 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8251121076233184, "acc_stderr": 0.025495284626444965, "acc_norm": 0.8251121076233184, "acc_norm_stderr": 0.025495284626444965 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.9007633587786259, "acc_stderr": 0.02622223517147735, "acc_norm": 0.9007633587786259, "acc_norm_stderr": 0.02622223517147735 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8925619834710744, "acc_stderr": 0.028268812192540616, "acc_norm": 0.8925619834710744, "acc_norm_stderr": 0.028268812192540616 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.036028141763926456, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.036028141763926456 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8588957055214724, "acc_stderr": 0.027351605518389752, "acc_norm": 0.8588957055214724, "acc_norm_stderr": 0.027351605518389752 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6160714285714286, "acc_stderr": 0.04616143075028546, "acc_norm": 0.6160714285714286, "acc_norm_stderr": 0.04616143075028546 }, "harness|hendrycksTest-management|5": { "acc": 0.8932038834951457, "acc_stderr": 0.030581088928331362, "acc_norm": 0.8932038834951457, "acc_norm_stderr": 0.030581088928331362 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9487179487179487, "acc_stderr": 0.01445018117687274, "acc_norm": 0.9487179487179487, "acc_norm_stderr": 0.01445018117687274 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.85, "acc_stderr": 0.035887028128263734, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263734 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9169859514687101, "acc_stderr": 0.009866287394639536, "acc_norm": 0.9169859514687101, "acc_norm_stderr": 0.009866287394639536 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8410404624277457, "acc_stderr": 0.019685307033571946, "acc_norm": 0.8410404624277457, "acc_norm_stderr": 0.019685307033571946 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6435754189944134, "acc_stderr": 0.016018239710513398, "acc_norm": 0.6435754189944134, "acc_norm_stderr": 0.016018239710513398 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8496732026143791, "acc_stderr": 0.020464175124332632, "acc_norm": 0.8496732026143791, "acc_norm_stderr": 0.020464175124332632 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8360128617363344, "acc_stderr": 0.021029576464662695, "acc_norm": 0.8360128617363344, "acc_norm_stderr": 0.021029576464662695 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8672839506172839, "acc_stderr": 0.018877353839571842, "acc_norm": 0.8672839506172839, "acc_norm_stderr": 
0.018877353839571842 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6524822695035462, "acc_stderr": 0.028406627809590954, "acc_norm": 0.6524822695035462, "acc_norm_stderr": 0.028406627809590954 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6127770534550195, "acc_stderr": 0.012441155326854931, "acc_norm": 0.6127770534550195, "acc_norm_stderr": 0.012441155326854931 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8455882352941176, "acc_stderr": 0.021950024722922033, "acc_norm": 0.8455882352941176, "acc_norm_stderr": 0.021950024722922033 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8235294117647058, "acc_stderr": 0.015422512066262552, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.015422512066262552 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8244897959183674, "acc_stderr": 0.02435280072297001, "acc_norm": 0.8244897959183674, "acc_norm_stderr": 0.02435280072297001 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700643, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700643 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.95, "acc_stderr": 0.021904291355759033, "acc_norm": 0.95, "acc_norm_stderr": 0.021904291355759033 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598053, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598053 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8888888888888888, "acc_stderr": 0.024103384202072864, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.024103384202072864 }, "harness|truthfulqa:mc|0": { "mc1": 0.42717258261933905, "mc1_stderr": 0.017316834410963933, "mc2": 0.6019109516805667, "mc2_stderr": 0.014606562783785249 }, "harness|winogrande|5": { "acc": 0.824782951854775, "acc_stderr": 0.010684179227706177 }, "harness|gsm8k|5": { "acc": 0.7043214556482184, "acc_stderr": 0.012570068947898772 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
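As a companion to the loading example in the dataset summary above, the sketch below shows how the aggregated "results" configuration mentioned there could be pulled instead of a single task. This is a usage sketch following the conventions these leaderboard detail repositories appear to use (a "results" config with a "latest" split), not an exhaustive reference.

```python
from datasets import load_dataset

# Aggregated results for the latest run; the "results" config and "latest"
# split names follow the convention described in the dataset summary above.
results = load_dataset(
    "open-llm-leaderboard/details_Qwen__Qwen-72B",
    "results",
    split="latest",
)
print(results[0])
```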
open-llm-leaderboard/details_Qwen__Qwen-72B
[ "region:us" ]
2023-12-09T13:58:50+00:00
{"pretty_name": "Evaluation run of Qwen/Qwen-72B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Qwen/Qwen-72B](https://huggingface.co/Qwen/Qwen-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 62 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen-72B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T02:10:37.267059](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen-72B/blob/main/results_2023-12-05T02-10-37.267059.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7692238060042156,\n \"acc_stderr\": 0.027807291244956196,\n \"acc_norm\": 0.7731238892784332,\n \"acc_norm_stderr\": 0.028330728981592973,\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.6019109516805667,\n \"mc2_stderr\": 0.014606562783785249\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.0141696645203031,\n \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.01392100859517935\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6684923322047401,\n \"acc_stderr\": 0.004697929774670292,\n \"acc_norm\": 0.8593905596494722,\n \"acc_norm_stderr\": 0.0034690778470563865\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930394,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930394\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.022691482872035342,\n \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.022691482872035342\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110175,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110175\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.026754391348039766,\n \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.026754391348039766\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747549,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747549\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6878306878306878,\n \"acc_stderr\": 0.02386520683697258,\n \"acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.02386520683697258\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.017545102951656632,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.017545102951656632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280459,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280459\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.017646526677233317,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.017646526677233317\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588768,\n \"acc_norm\": 0.8102564102564103,\n 
\"acc_norm_stderr\": 0.019880165406588768\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.03048470166508437,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.03048470166508437\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5695364238410596,\n \"acc_stderr\": 0.04042809961395634,\n \"acc_norm\": 0.5695364238410596,\n \"acc_norm_stderr\": 0.04042809961395634\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9284403669724771,\n \"acc_stderr\": 0.011051255247815476,\n \"acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.011051255247815476\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6990740740740741,\n \"acc_stderr\": 0.03128039084329883,\n \"acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.03128039084329883\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9362745098039216,\n \"acc_stderr\": 0.01714392165552496,\n \"acc_norm\": 0.9362745098039216,\n \"acc_norm_stderr\": 0.01714392165552496\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065505,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065505\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8251121076233184,\n \"acc_stderr\": 0.025495284626444965,\n \"acc_norm\": 0.8251121076233184,\n \"acc_norm_stderr\": 0.025495284626444965\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147735,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147735\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331362,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331362\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n \"acc_stderr\": 0.01445018117687274,\n \"acc_norm\": 0.9487179487179487,\n \"acc_norm_stderr\": 0.01445018117687274\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9169859514687101,\n \"acc_stderr\": 0.009866287394639536,\n \"acc_norm\": 0.9169859514687101,\n \"acc_norm_stderr\": 0.009866287394639536\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6435754189944134,\n \"acc_stderr\": 0.016018239710513398,\n \"acc_norm\": 0.6435754189944134,\n \"acc_norm_stderr\": 0.016018239710513398\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.020464175124332632,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.020464175124332632\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6524822695035462,\n \"acc_stderr\": 0.028406627809590954,\n \"acc_norm\": 0.6524822695035462,\n \"acc_norm_stderr\": 0.028406627809590954\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6127770534550195,\n \"acc_stderr\": 0.012441155326854931,\n \"acc_norm\": 0.6127770534550195,\n \"acc_norm_stderr\": 0.012441155326854931\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8455882352941176,\n \"acc_stderr\": 0.021950024722922033,\n \"acc_norm\": 0.8455882352941176,\n \"acc_norm_stderr\": 0.021950024722922033\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262552,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262552\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759033,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759033\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.6019109516805667,\n \"mc2_stderr\": 0.014606562783785249\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706177\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \"acc_stderr\": 0.012570068947898772\n }\n}\n```", "repo_url": "https://huggingface.co/Qwen/Qwen-72B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|arc:challenge|25_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|gsm8k|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hellaswag|10_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T02-10-37.267059.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T02_10_37.267059", "path": ["**/details_harness|winogrande|5_2023-12-05T02-10-37.267059.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T02-10-37.267059.parquet"]}]}]}
2023-12-09T14:27:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Qwen/Qwen-72B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Qwen/Qwen-72B on the Open LLM Leaderboard. The dataset is composed of 62 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-05T02:10:37.267059 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
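For completeness, the loading snippet that the flattened card above refers to ("To load the details from a run...") was stripped in this copy; it is preserved verbatim in the metadata record above, and amounts to the standard `datasets` call shown here, with the repository id and config name taken straight from this card:

```python
from datasets import load_dataset

# Per-example details for one evaluated task ("harness_winogrande_5"),
# pulled from the details repository named in this card.
data = load_dataset(
    "open-llm-leaderboard/details_Qwen__Qwen-72B",
    "harness_winogrande_5",
    split="train",
)
```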
[ "# Dataset Card for Evaluation run of Qwen/Qwen-72B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Qwen/Qwen-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 62 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-05T02:10:37.267059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Qwen/Qwen-72B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Qwen/Qwen-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 62 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-05T02:10:37.267059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Qwen/Qwen-72B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Qwen/Qwen-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 62 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-05T02:10:37.267059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ce59ad1fdcd035e5be0021e173384c31dcb1e8d7
# Dataset Card for Evaluation run of webbigdata/ALMA-7B-Ja-V2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/webbigdata/ALMA-7B-Ja-V2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [webbigdata/ALMA-7B-Ja-V2](https://huggingface.co/webbigdata/ALMA-7B-Ja-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_webbigdata__ALMA-7B-Ja-V2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T14:45:04.238989](https://huggingface.co/datasets/open-llm-leaderboard/details_webbigdata__ALMA-7B-Ja-V2/blob/main/results_2023-12-09T14-45-04.238989.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.44789895292493354, "acc_stderr": 0.034270064219110566, "acc_norm": 0.45411301041384805, "acc_norm_stderr": 0.035207954754141714, "mc1": 0.2533659730722154, "mc1_stderr": 0.015225899340826845, "mc2": 0.3865901488087726, "mc2_stderr": 0.014093311661436469 }, "harness|arc:challenge|25": { "acc": 0.5102389078498294, "acc_stderr": 0.014608326906285012, "acc_norm": 0.5238907849829352, "acc_norm_stderr": 0.014594701798071654 }, "harness|hellaswag|10": { "acc": 0.5880302728540131, "acc_stderr": 0.004911837730582204, "acc_norm": 0.7792272455686118, "acc_norm_stderr": 0.004139199120463524 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464242, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464242 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.39473684210526316, "acc_stderr": 0.039777499346220734, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.039777499346220734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.47547169811320755, "acc_stderr": 0.030735822206205608, "acc_norm": 0.47547169811320755, "acc_norm_stderr": 0.030735822206205608 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4236111111111111, "acc_stderr": 0.0413212501972337, "acc_norm": 0.4236111111111111, "acc_norm_stderr": 0.0413212501972337 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 
0.047937248544110196 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939098, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092056, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092056 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4297872340425532, "acc_stderr": 0.03236214467715563, "acc_norm": 0.4297872340425532, "acc_norm_stderr": 0.03236214467715563 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.32456140350877194, "acc_stderr": 0.04404556157374767, "acc_norm": 0.32456140350877194, "acc_norm_stderr": 0.04404556157374767 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.42758620689655175, "acc_stderr": 0.04122737111370331, "acc_norm": 0.42758620689655175, "acc_norm_stderr": 0.04122737111370331 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2830687830687831, "acc_stderr": 0.023201392938194974, "acc_norm": 0.2830687830687831, "acc_norm_stderr": 0.023201392938194974 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2619047619047619, "acc_stderr": 0.03932537680392871, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.03932537680392871 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.45806451612903226, "acc_stderr": 0.028343787250540636, "acc_norm": 0.45806451612903226, "acc_norm_stderr": 0.028343787250540636 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.31527093596059114, "acc_stderr": 0.03269080871970187, "acc_norm": 0.31527093596059114, "acc_norm_stderr": 0.03269080871970187 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5757575757575758, "acc_stderr": 0.03859268142070264, "acc_norm": 0.5757575757575758, "acc_norm_stderr": 0.03859268142070264 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.4898989898989899, "acc_stderr": 0.03561625488673745, "acc_norm": 0.4898989898989899, "acc_norm_stderr": 0.03561625488673745 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6424870466321243, "acc_stderr": 0.034588160421810114, "acc_norm": 0.6424870466321243, "acc_norm_stderr": 0.034588160421810114 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.382051282051282, "acc_stderr": 0.024635549163908227, "acc_norm": 0.382051282051282, "acc_norm_stderr": 0.024635549163908227 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.27037037037037037, "acc_stderr": 0.027080372815145675, "acc_norm": 0.27037037037037037, "acc_norm_stderr": 0.027080372815145675 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3865546218487395, "acc_stderr": 
0.03163145807552379, "acc_norm": 0.3865546218487395, "acc_norm_stderr": 0.03163145807552379 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6201834862385321, "acc_stderr": 0.020808825617866244, "acc_norm": 0.6201834862385321, "acc_norm_stderr": 0.020808825617866244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2222222222222222, "acc_stderr": 0.028353212866863445, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.028353212866863445 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4950980392156863, "acc_stderr": 0.035091433756067866, "acc_norm": 0.4950980392156863, "acc_norm_stderr": 0.035091433756067866 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.540084388185654, "acc_stderr": 0.03244246810187914, "acc_norm": 0.540084388185654, "acc_norm_stderr": 0.03244246810187914 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5605381165919282, "acc_stderr": 0.03331092511038179, "acc_norm": 0.5605381165919282, "acc_norm_stderr": 0.03331092511038179 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5343511450381679, "acc_stderr": 0.043749285605997376, "acc_norm": 0.5343511450381679, "acc_norm_stderr": 0.043749285605997376 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.04412015806624505, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.04412015806624505 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4351851851851852, "acc_stderr": 0.04792898170907062, "acc_norm": 0.4351851851851852, "acc_norm_stderr": 0.04792898170907062 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.44171779141104295, "acc_stderr": 0.03901591825836183, "acc_norm": 0.44171779141104295, "acc_norm_stderr": 0.03901591825836183 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.0457237235873743, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.0457237235873743 }, "harness|hendrycksTest-management|5": { "acc": 0.5145631067961165, "acc_stderr": 0.049486373240266356, "acc_norm": 0.5145631067961165, "acc_norm_stderr": 0.049486373240266356 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6923076923076923, "acc_stderr": 0.030236389942173078, "acc_norm": 0.6923076923076923, "acc_norm_stderr": 0.030236389942173078 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6372924648786717, "acc_stderr": 0.017192708674602306, "acc_norm": 0.6372924648786717, "acc_norm_stderr": 0.017192708674602306 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5173410404624278, "acc_stderr": 0.02690290045866664, "acc_norm": 0.5173410404624278, "acc_norm_stderr": 0.02690290045866664 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.45751633986928103, "acc_stderr": 0.028526383452142635, "acc_norm": 0.45751633986928103, "acc_norm_stderr": 0.028526383452142635 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.594855305466238, "acc_stderr": 0.027882383791325956, "acc_norm": 0.594855305466238, "acc_norm_stderr": 0.027882383791325956 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.4722222222222222, "acc_stderr": 0.027777777777777804, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.027777777777777804 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.35815602836879434, "acc_stderr": 0.028602085862759426, "acc_norm": 0.35815602836879434, "acc_norm_stderr": 0.028602085862759426 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3539765319426336, "acc_stderr": 0.012213504731731634, "acc_norm": 0.3539765319426336, "acc_norm_stderr": 0.012213504731731634 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4852941176470588, "acc_stderr": 0.03035969707904612, "acc_norm": 0.4852941176470588, "acc_norm_stderr": 0.03035969707904612 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4215686274509804, "acc_stderr": 0.01997742260022747, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.01997742260022747 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4818181818181818, "acc_stderr": 0.04785964010794917, "acc_norm": 0.4818181818181818, "acc_norm_stderr": 0.04785964010794917 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.49387755102040815, "acc_stderr": 0.03200682020163909, "acc_norm": 0.49387755102040815, "acc_norm_stderr": 0.03200682020163909 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6417910447761194, "acc_stderr": 0.03390393042268813, "acc_norm": 0.6417910447761194, "acc_norm_stderr": 0.03390393042268813 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-virology|5": { "acc": 0.4036144578313253, "acc_stderr": 0.03819486140758399, "acc_norm": 0.4036144578313253, "acc_norm_stderr": 0.03819486140758399 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03565079670708313, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03565079670708313 }, "harness|truthfulqa:mc|0": { "mc1": 0.2533659730722154, "mc1_stderr": 0.015225899340826845, "mc2": 0.3865901488087726, "mc2_stderr": 0.014093311661436469 }, "harness|winogrande|5": { "acc": 0.734017363851618, "acc_stderr": 0.012418323153051048 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_webbigdata__ALMA-7B-Ja-V2
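The card above names an aggregated "results" configuration and a "latest" split; a minimal, hedged sketch of reading those summary metrics with the `datasets` library is shown below (the exact column layout of the results parquet is an assumption here, so inspect the returned row rather than relying on specific field names):

```python
from datasets import load_dataset

# "results" is the aggregated-metrics config and "latest" tracks the most
# recent timestamped run, per the config list in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_webbigdata__ALMA-7B-Ja-V2",
    "results",
    split="latest",
)

# One row per run; inspect it to find the per-task accuracy fields.
print(results[0])
```

Replacing "results" with any of the per-task configs listed in the metadata (for example "harness_gsm8k_5") pulls the corresponding detail parquet instead.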
[ "region:us" ]
2023-12-09T14:48:02+00:00
{"pretty_name": "Evaluation run of webbigdata/ALMA-7B-Ja-V2", "dataset_summary": "Dataset automatically created during the evaluation run of model [webbigdata/ALMA-7B-Ja-V2](https://huggingface.co/webbigdata/ALMA-7B-Ja-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_webbigdata__ALMA-7B-Ja-V2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T14:45:04.238989](https://huggingface.co/datasets/open-llm-leaderboard/details_webbigdata__ALMA-7B-Ja-V2/blob/main/results_2023-12-09T14-45-04.238989.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44789895292493354,\n \"acc_stderr\": 0.034270064219110566,\n \"acc_norm\": 0.45411301041384805,\n \"acc_norm_stderr\": 0.035207954754141714,\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.015225899340826845,\n \"mc2\": 0.3865901488087726,\n \"mc2_stderr\": 0.014093311661436469\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285012,\n \"acc_norm\": 0.5238907849829352,\n \"acc_norm_stderr\": 0.014594701798071654\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5880302728540131,\n \"acc_stderr\": 0.004911837730582204,\n \"acc_norm\": 0.7792272455686118,\n \"acc_norm_stderr\": 0.004139199120463524\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.4236111111111111,\n \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 
0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45806451612903226,\n \"acc_stderr\": 0.028343787250540636,\n \"acc_norm\": 0.45806451612903226,\n \"acc_norm_stderr\": 0.028343787250540636\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970187,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970187\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.382051282051282,\n \"acc_stderr\": 0.024635549163908227,\n \"acc_norm\": 0.382051282051282,\n \"acc_norm_stderr\": 0.024635549163908227\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145675,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145675\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.03163145807552379,\n \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.03163145807552379\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6201834862385321,\n \"acc_stderr\": 0.020808825617866244,\n \"acc_norm\": 0.6201834862385321,\n \"acc_norm_stderr\": 0.020808825617866244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863445,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863445\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4950980392156863,\n \"acc_stderr\": 0.035091433756067866,\n \"acc_norm\": 0.4950980392156863,\n \"acc_norm_stderr\": 0.035091433756067866\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.540084388185654,\n \"acc_stderr\": 0.03244246810187914,\n \"acc_norm\": 0.540084388185654,\n \"acc_norm_stderr\": 0.03244246810187914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.04412015806624505,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624505\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.03901591825836183,\n \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.03901591825836183\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.030236389942173078,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.030236389942173078\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6372924648786717,\n \"acc_stderr\": 0.017192708674602306,\n \"acc_norm\": 
0.6372924648786717,\n \"acc_norm_stderr\": 0.017192708674602306\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.45751633986928103,\n \"acc_stderr\": 0.028526383452142635,\n \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.028526383452142635\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n \"acc_stderr\": 0.027882383791325956,\n \"acc_norm\": 0.594855305466238,\n \"acc_norm_stderr\": 0.027882383791325956\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.027777777777777804,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.027777777777777804\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759426,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759426\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3539765319426336,\n \"acc_stderr\": 0.012213504731731634,\n \"acc_norm\": 0.3539765319426336,\n \"acc_norm_stderr\": 0.012213504731731634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.01997742260022747,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.01997742260022747\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.4818181818181818,\n \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163909,\n \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163909\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n \"acc_stderr\": 0.03819486140758399,\n \"acc_norm\": 0.4036144578313253,\n \"acc_norm_stderr\": 0.03819486140758399\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708313,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708313\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.015225899340826845,\n \"mc2\": 0.3865901488087726,\n \"mc2_stderr\": 0.014093311661436469\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.012418323153051048\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/webbigdata/ALMA-7B-Ja-V2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|arc:challenge|25_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|gsm8k|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hellaswag|10_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-45-04.238989.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-45-04.238989.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-45-04.238989.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T14-45-04.238989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-45-04.238989.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-45-04.238989.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["**/details_harness|winogrande|5_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T14-45-04.238989.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T14_45_04.238989", "path": ["results_2023-12-09T14-45-04.238989.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T14-45-04.238989.parquet"]}]}]}
2023-12-09T14:48:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of webbigdata/ALMA-7B-Ja-V2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model webbigdata/ALMA-7B-Ja-V2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T14:45:04.238989 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
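In the flattened card text above, the sentence "To load the details from a run, you can for instance do the following:" lost its code block during processing; the snippet it refers to, taken from the full card earlier in this record, is:

```python
from datasets import load_dataset

# Each evaluated task has its own config; "harness_winogrande_5" is one of
# the 63 configs, and the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_webbigdata__ALMA-7B-Ja-V2",
    "harness_winogrande_5",
    split="train",
)
```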
[ "# Dataset Card for Evaluation run of webbigdata/ALMA-7B-Ja-V2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model webbigdata/ALMA-7B-Ja-V2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T14:45:04.238989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of webbigdata/ALMA-7B-Ja-V2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model webbigdata/ALMA-7B-Ja-V2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T14:45:04.238989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of webbigdata/ALMA-7B-Ja-V2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model webbigdata/ALMA-7B-Ja-V2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T14:45:04.238989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3ccebf45e265293d19f6b4a2058ee4304e975ffd
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
rtp-gcp/icy_bridge_data
[ "region:us" ]
2023-12-09T14:52:03+00:00
{}
2023-12-09T14:53:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ 6, 34, 4, 40, 29, 3, 4, 9, 6, 5, 7, 4, 7, 10, 9, 5, 9, 8, 10, 46, 8, 7, 10, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact" ]
e33cbe5acf072fed6e4e37ab8d3e5207ce0df182
# Dataset Card for Evaluation run of haoranxu/ALMA-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/haoranxu/ALMA-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [haoranxu/ALMA-7B](https://huggingface.co/haoranxu/ALMA-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T14:49:25.025957](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-7B/blob/main/results_2023-12-09T14-49-25.025957.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3842750602900277, "acc_stderr": 0.0337414380010653, "acc_norm": 0.388832150301939, "acc_norm_stderr": 0.03466152745310896, "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731613, "mc2": 0.3564384771875291, "mc2_stderr": 0.013567943486529975 }, "harness|arc:challenge|25": { "acc": 0.47013651877133106, "acc_stderr": 0.0145853058400071, "acc_norm": 0.5034129692832765, "acc_norm_stderr": 0.014611050403244081 }, "harness|hellaswag|10": { "acc": 0.5642302330213105, "acc_stderr": 0.004948439229523914, "acc_norm": 0.7550288787094205, "acc_norm_stderr": 0.004291911350430712 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.42962962962962964, "acc_stderr": 0.04276349494376599, "acc_norm": 0.42962962962962964, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3618421052631579, "acc_stderr": 0.03910525752849724, "acc_norm": 0.3618421052631579, "acc_norm_stderr": 0.03910525752849724 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.37358490566037733, "acc_stderr": 0.02977308271331987, "acc_norm": 0.37358490566037733, "acc_norm_stderr": 0.02977308271331987 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3541666666666667, "acc_stderr": 0.039994111357535424, "acc_norm": 0.3541666666666667, "acc_norm_stderr": 0.039994111357535424 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816508, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3468208092485549, "acc_stderr": 0.036291466701596636, "acc_norm": 0.3468208092485549, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.18627450980392157, "acc_stderr": 0.038739587141493524, "acc_norm": 0.18627450980392157, "acc_norm_stderr": 0.038739587141493524 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3872340425531915, "acc_stderr": 0.03184389265339525, "acc_norm": 0.3872340425531915, "acc_norm_stderr": 0.03184389265339525 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4, "acc_stderr": 0.04082482904638629, "acc_norm": 0.4, "acc_norm_stderr": 0.04082482904638629 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.26455026455026454, "acc_stderr": 0.022717467897708617, "acc_norm": 0.26455026455026454, "acc_norm_stderr": 0.022717467897708617 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.19047619047619047, "acc_stderr": 0.035122074123020514, "acc_norm": 0.19047619047619047, "acc_norm_stderr": 0.035122074123020514 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.36129032258064514, "acc_stderr": 0.027327548447957543, "acc_norm": 0.36129032258064514, "acc_norm_stderr": 0.027327548447957543 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.33497536945812806, "acc_stderr": 0.0332085274234831, "acc_norm": 0.33497536945812806, "acc_norm_stderr": 0.0332085274234831 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.43636363636363634, "acc_stderr": 0.03872592983524753, "acc_norm": 0.43636363636363634, "acc_norm_stderr": 0.03872592983524753 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.398989898989899, "acc_stderr": 0.03488901616852731, "acc_norm": 0.398989898989899, "acc_norm_stderr": 0.03488901616852731 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.48186528497409326, "acc_stderr": 0.036060650018329185, "acc_norm": 0.48186528497409326, "acc_norm_stderr": 0.036060650018329185 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.34615384615384615, "acc_stderr": 0.024121125416941187, "acc_norm": 0.34615384615384615, "acc_norm_stderr": 0.024121125416941187 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.027309140588230186, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.027309140588230186 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.31512605042016806, "acc_stderr": 0.03017680828897434, "acc_norm": 0.31512605042016806, 
"acc_norm_stderr": 0.03017680828897434 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2582781456953642, "acc_stderr": 0.035737053147634576, "acc_norm": 0.2582781456953642, "acc_norm_stderr": 0.035737053147634576 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.48440366972477067, "acc_stderr": 0.02142689153920805, "acc_norm": 0.48440366972477067, "acc_norm_stderr": 0.02142689153920805 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.20833333333333334, "acc_stderr": 0.027696910713093936, "acc_norm": 0.20833333333333334, "acc_norm_stderr": 0.027696910713093936 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4362745098039216, "acc_stderr": 0.03480693138457039, "acc_norm": 0.4362745098039216, "acc_norm_stderr": 0.03480693138457039 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.48523206751054854, "acc_stderr": 0.032533028078777386, "acc_norm": 0.48523206751054854, "acc_norm_stderr": 0.032533028078777386 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4798206278026906, "acc_stderr": 0.033530461674123, "acc_norm": 0.4798206278026906, "acc_norm_stderr": 0.033530461674123 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3969465648854962, "acc_stderr": 0.04291135671009224, "acc_norm": 0.3969465648854962, "acc_norm_stderr": 0.04291135671009224 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5702479338842975, "acc_stderr": 0.04519082021319772, "acc_norm": 0.5702479338842975, "acc_norm_stderr": 0.04519082021319772 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.42592592592592593, "acc_stderr": 0.0478034362693679, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.0478034362693679 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.39263803680981596, "acc_stderr": 0.03836740907831029, "acc_norm": 0.39263803680981596, "acc_norm_stderr": 0.03836740907831029 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.044328040552915185, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.044328040552915185 }, "harness|hendrycksTest-management|5": { "acc": 0.42718446601941745, "acc_stderr": 0.04897957737781168, "acc_norm": 0.42718446601941745, "acc_norm_stderr": 0.04897957737781168 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03255326307272486, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.03255326307272486 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.5197956577266922, "acc_stderr": 0.017865944827291626, "acc_norm": 0.5197956577266922, "acc_norm_stderr": 0.017865944827291626 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.41040462427745666, "acc_stderr": 0.02648339204209818, "acc_norm": 0.41040462427745666, "acc_norm_stderr": 0.02648339204209818 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.38562091503267976, "acc_stderr": 0.027870745278290313, "acc_norm": 0.38562091503267976, "acc_norm_stderr": 0.027870745278290313 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5080385852090032, "acc_stderr": 0.02839442137098453, "acc_norm": 0.5080385852090032, "acc_norm_stderr": 0.02839442137098453 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.44753086419753085, "acc_stderr": 0.027667138569422704, "acc_norm": 0.44753086419753085, "acc_norm_stderr": 0.027667138569422704 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.028838921471251458, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.028838921471251458 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.29465449804432853, "acc_stderr": 0.011643576764069548, "acc_norm": 0.29465449804432853, "acc_norm_stderr": 0.011643576764069548 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4007352941176471, "acc_stderr": 0.029768263528933112, "acc_norm": 0.4007352941176471, "acc_norm_stderr": 0.029768263528933112 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3937908496732026, "acc_stderr": 0.019766211991073056, "acc_norm": 0.3937908496732026, "acc_norm_stderr": 0.019766211991073056 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.44545454545454544, "acc_stderr": 0.047605488214603246, "acc_norm": 0.44545454545454544, "acc_norm_stderr": 0.047605488214603246 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.30612244897959184, "acc_stderr": 0.02950489645459596, "acc_norm": 0.30612244897959184, "acc_norm_stderr": 0.02950489645459596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.4527363184079602, "acc_stderr": 0.03519702717576915, "acc_norm": 0.4527363184079602, "acc_norm_stderr": 0.03519702717576915 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-virology|5": { "acc": 0.3313253012048193, "acc_stderr": 0.03664314777288085, "acc_norm": 0.3313253012048193, "acc_norm_stderr": 0.03664314777288085 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5847953216374269, "acc_stderr": 0.037792759455032014, "acc_norm": 0.5847953216374269, "acc_norm_stderr": 0.037792759455032014 }, "harness|truthfulqa:mc|0": { "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731613, "mc2": 0.3564384771875291, "mc2_stderr": 0.013567943486529975 }, "harness|winogrande|5": { "acc": 0.7237569060773481, "acc_stderr": 0.012566815015698157 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
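As a complement to the loading example above, each per-task configuration listed for this repository also exposes a "latest" split that mirrors the most recent timestamped run. The following is a minimal sketch, not part of the original card; the repository id, configuration name, and split name are taken from the configuration list in this record, while the variable name is illustrative only:

```python
from datasets import load_dataset

# Minimal sketch: load the most recent GSM8K details via the "latest" split,
# which mirrors the newest timestamped run for that configuration.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_haoranxu__ALMA-7B",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```

The timestamped splits (e.g. "2023_12_09T14_49_25.025957") can be requested the same way to compare successive evaluation runs.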
open-llm-leaderboard/details_haoranxu__ALMA-7B
[ "region:us" ]
2023-12-09T14:52:20+00:00
{"pretty_name": "Evaluation run of haoranxu/ALMA-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [haoranxu/ALMA-7B](https://huggingface.co/haoranxu/ALMA-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haoranxu__ALMA-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T14:49:25.025957](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-7B/blob/main/results_2023-12-09T14-49-25.025957.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3842750602900277,\n \"acc_stderr\": 0.0337414380010653,\n \"acc_norm\": 0.388832150301939,\n \"acc_norm_stderr\": 0.03466152745310896,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731613,\n \"mc2\": 0.3564384771875291,\n \"mc2_stderr\": 0.013567943486529975\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.47013651877133106,\n \"acc_stderr\": 0.0145853058400071,\n \"acc_norm\": 0.5034129692832765,\n \"acc_norm_stderr\": 0.014611050403244081\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5642302330213105,\n \"acc_stderr\": 0.004948439229523914,\n \"acc_norm\": 0.7550288787094205,\n \"acc_norm_stderr\": 0.004291911350430712\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3618421052631579,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.3618421052631579,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.37358490566037733,\n \"acc_stderr\": 0.02977308271331987,\n \"acc_norm\": 0.37358490566037733,\n \"acc_norm_stderr\": 0.02977308271331987\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04082482904638629,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36129032258064514,\n \"acc_stderr\": 0.027327548447957543,\n \"acc_norm\": 0.36129032258064514,\n \"acc_norm_stderr\": 0.027327548447957543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.0332085274234831,\n \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.0332085274234831\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.03872592983524753,\n \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.03872592983524753\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.398989898989899,\n \"acc_stderr\": 0.03488901616852731,\n \"acc_norm\": 0.398989898989899,\n \"acc_norm_stderr\": 0.03488901616852731\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.48186528497409326,\n \"acc_stderr\": 0.036060650018329185,\n \"acc_norm\": 0.48186528497409326,\n \"acc_norm_stderr\": 0.036060650018329185\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941187,\n \"acc_norm\": 0.34615384615384615,\n 
\"acc_norm_stderr\": 0.024121125416941187\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.48440366972477067,\n \"acc_stderr\": 0.02142689153920805,\n \"acc_norm\": 0.48440366972477067,\n \"acc_norm_stderr\": 0.02142689153920805\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4362745098039216,\n \"acc_stderr\": 0.03480693138457039,\n \"acc_norm\": 0.4362745098039216,\n \"acc_norm_stderr\": 0.03480693138457039\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.48523206751054854,\n \"acc_stderr\": 0.032533028078777386,\n \"acc_norm\": 0.48523206751054854,\n \"acc_norm_stderr\": 0.032533028078777386\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03255326307272486,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03255326307272486\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5197956577266922,\n \"acc_stderr\": 0.017865944827291626,\n \"acc_norm\": 0.5197956577266922,\n \"acc_norm_stderr\": 0.017865944827291626\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.02648339204209818,\n \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.02648339204209818\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.38562091503267976,\n \"acc_stderr\": 0.027870745278290313,\n \"acc_norm\": 0.38562091503267976,\n \"acc_norm_stderr\": 0.027870745278290313\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n \"acc_stderr\": 0.02839442137098453,\n \"acc_norm\": 0.5080385852090032,\n \"acc_norm_stderr\": 0.02839442137098453\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422704,\n \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29465449804432853,\n \"acc_stderr\": 0.011643576764069548,\n \"acc_norm\": 0.29465449804432853,\n \"acc_norm_stderr\": 0.011643576764069548\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933112,\n \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933112\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3937908496732026,\n \"acc_stderr\": 0.019766211991073056,\n \"acc_norm\": 0.3937908496732026,\n \"acc_norm_stderr\": 0.019766211991073056\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.30612244897959184,\n \"acc_stderr\": 0.02950489645459596,\n \"acc_norm\": 0.30612244897959184,\n \"acc_norm_stderr\": 0.02950489645459596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4527363184079602,\n \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.4527363184079602,\n \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.03664314777288085,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.03664314777288085\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5847953216374269,\n \"acc_stderr\": 0.037792759455032014,\n \"acc_norm\": 0.5847953216374269,\n \"acc_norm_stderr\": 0.037792759455032014\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731613,\n \"mc2\": 0.3564384771875291,\n \"mc2_stderr\": 0.013567943486529975\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7237569060773481,\n \"acc_stderr\": 0.012566815015698157\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/haoranxu/ALMA-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|arc:challenge|25_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|gsm8k|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hellaswag|10_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["**/details_harness|winogrande|5_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T14-49-25.025957.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T14_49_25.025957", "path": ["results_2023-12-09T14-49-25.025957.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T14-49-25.025957.parquet"]}]}]}
2023-12-09T14:53:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of haoranxu/ALMA-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model haoranxu/ALMA-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T14:49:25.025957 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
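The flattened summary above ends at "To load the details from a run, you can for instance do the following:" without the snippet that normally accompanies these cards. A minimal sketch is given below: the `harness_winogrande_5` config and the `latest` split are taken from the config list shown a few lines earlier, while the repository id `open-llm-leaderboard/details_haoranxu__ALMA-7B` is an assumption inferred from the `details_<org>__<model>` naming pattern used by the other records in this dump, not quoted from this one.

```python
from datasets import load_dataset

# Repository id assumed from the "details_<org>__<model>" naming convention used elsewhere
# in this dump; the config name and the "latest" split come from the config list shown above.
data = load_dataset(
    "open-llm-leaderboard/details_haoranxu__ALMA-7B",
    "harness_winogrande_5",
    split="latest",
)
print(data)  # per-example details for the winogrande 5-shot run
```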
[ "# Dataset Card for Evaluation run of haoranxu/ALMA-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model haoranxu/ALMA-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T14:49:25.025957(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of haoranxu/ALMA-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model haoranxu/ALMA-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T14:49:25.025957(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 16, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of haoranxu/ALMA-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model haoranxu/ALMA-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T14:49:25.025957(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2c34019db257d89144b118631015c64b89766532
# Dataset Card for Evaluation run of u-chom/ex-llm-e1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/u-chom/ex-llm-e1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [u-chom/ex-llm-e1](https://huggingface.co/u-chom/ex-llm-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_u-chom__ex-llm-e1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T14:50:53.053467](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__ex-llm-e1/blob/main/results_2023-12-09T14-50-53.053467.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.39402023545271236, "acc_stderr": 0.03431633094943852, "acc_norm": 0.3992950895868925, "acc_norm_stderr": 0.03515648307530415, "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237009, "mc2": 0.4200995329344425, "mc2_stderr": 0.01434315654117436 }, "harness|arc:challenge|25": { "acc": 0.35921501706484643, "acc_stderr": 0.014020224155839159, "acc_norm": 0.3993174061433447, "acc_norm_stderr": 0.014312094557946698 }, "harness|hellaswag|10": { "acc": 0.5060744871539534, "acc_stderr": 0.004989413158034801, "acc_norm": 0.6811392152957578, "acc_norm_stderr": 0.004650825168905203 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464242, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464242 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.46710526315789475, "acc_stderr": 0.04060127035236397, "acc_norm": 0.46710526315789475, "acc_norm_stderr": 0.04060127035236397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.41509433962264153, "acc_stderr": 0.03032594578928611, "acc_norm": 0.41509433962264153, "acc_norm_stderr": 0.03032594578928611 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3680555555555556, "acc_stderr": 0.040329990539607195, "acc_norm": 0.3680555555555556, "acc_norm_stderr": 0.040329990539607195 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117317, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117317 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3699421965317919, "acc_stderr": 0.036812296333943194, "acc_norm": 0.3699421965317919, "acc_norm_stderr": 0.036812296333943194 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.33191489361702126, "acc_stderr": 0.030783736757745643, "acc_norm": 0.33191489361702126, "acc_norm_stderr": 0.030783736757745643 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322004, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322004 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3793103448275862, "acc_stderr": 0.040434618619167466, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.040434618619167466 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2830687830687831, "acc_stderr": 0.023201392938194974, "acc_norm": 0.2830687830687831, "acc_norm_stderr": 0.023201392938194974 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3709677419354839, "acc_stderr": 0.027480541887953593, "acc_norm": 0.3709677419354839, "acc_norm_stderr": 0.027480541887953593 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3251231527093596, "acc_stderr": 0.032957975663112704, "acc_norm": 0.3251231527093596, "acc_norm_stderr": 0.032957975663112704 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.4727272727272727, "acc_stderr": 0.0389853160557942, "acc_norm": 0.4727272727272727, "acc_norm_stderr": 0.0389853160557942 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.4444444444444444, "acc_stderr": 0.035402943770953675, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.035402943770953675 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.5233160621761658, "acc_stderr": 0.03604513672442202, "acc_norm": 0.5233160621761658, "acc_norm_stderr": 0.03604513672442202 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.31025641025641026, "acc_stderr": 0.02345467488940429, "acc_norm": 0.31025641025641026, "acc_norm_stderr": 0.02345467488940429 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24444444444444444, "acc_stderr": 0.02620276653465215, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.02620276653465215 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3067226890756303, "acc_stderr": 0.02995382389188705, 
"acc_norm": 0.3067226890756303, "acc_norm_stderr": 0.02995382389188705 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.44220183486238535, "acc_stderr": 0.02129361320752021, "acc_norm": 0.44220183486238535, "acc_norm_stderr": 0.02129361320752021 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3287037037037037, "acc_stderr": 0.03203614084670058, "acc_norm": 0.3287037037037037, "acc_norm_stderr": 0.03203614084670058 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.45588235294117646, "acc_stderr": 0.03495624522015474, "acc_norm": 0.45588235294117646, "acc_norm_stderr": 0.03495624522015474 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.510548523206751, "acc_stderr": 0.032539983791662855, "acc_norm": 0.510548523206751, "acc_norm_stderr": 0.032539983791662855 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4618834080717489, "acc_stderr": 0.03346015011973228, "acc_norm": 0.4618834080717489, "acc_norm_stderr": 0.03346015011973228 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.44274809160305345, "acc_stderr": 0.04356447202665069, "acc_norm": 0.44274809160305345, "acc_norm_stderr": 0.04356447202665069 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5537190082644629, "acc_stderr": 0.0453793517794788, "acc_norm": 0.5537190082644629, "acc_norm_stderr": 0.0453793517794788 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.42592592592592593, "acc_stderr": 0.0478034362693679, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.0478034362693679 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.38650306748466257, "acc_stderr": 0.038258255488486076, "acc_norm": 0.38650306748466257, "acc_norm_stderr": 0.038258255488486076 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2767857142857143, "acc_stderr": 0.042466243366976256, "acc_norm": 0.2767857142857143, "acc_norm_stderr": 0.042466243366976256 }, "harness|hendrycksTest-management|5": { "acc": 0.34951456310679613, "acc_stderr": 0.047211885060971716, "acc_norm": 0.34951456310679613, "acc_norm_stderr": 0.047211885060971716 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5085470085470085, "acc_stderr": 0.0327513030009703, "acc_norm": 0.5085470085470085, "acc_norm_stderr": 0.0327513030009703 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465918, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465918 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.508301404853129, "acc_stderr": 0.017877498991072, "acc_norm": 0.508301404853129, "acc_norm_stderr": 0.017877498991072 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.407514450867052, "acc_stderr": 0.026454578146931494, "acc_norm": 0.407514450867052, "acc_norm_stderr": 0.026454578146931494 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2927374301675978, "acc_stderr": 0.015218109544410179, "acc_norm": 0.2927374301675978, "acc_norm_stderr": 0.015218109544410179 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.41830065359477125, "acc_stderr": 0.028245134024387285, "acc_norm": 0.41830065359477125, "acc_norm_stderr": 0.028245134024387285 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.39228295819935693, "acc_stderr": 0.027731258647011998, "acc_norm": 0.39228295819935693, "acc_norm_stderr": 0.027731258647011998 }, "harness|hendrycksTest-prehistory|5": { 
"acc": 0.41358024691358025, "acc_stderr": 0.02740204204026994, "acc_norm": 0.41358024691358025, "acc_norm_stderr": 0.02740204204026994 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3262411347517731, "acc_stderr": 0.027968453043563168, "acc_norm": 0.3262411347517731, "acc_norm_stderr": 0.027968453043563168 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3305084745762712, "acc_stderr": 0.01201414210184297, "acc_norm": 0.3305084745762712, "acc_norm_stderr": 0.01201414210184297 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.40808823529411764, "acc_stderr": 0.029855261393483924, "acc_norm": 0.40808823529411764, "acc_norm_stderr": 0.029855261393483924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3709150326797386, "acc_stderr": 0.019542101564854114, "acc_norm": 0.3709150326797386, "acc_norm_stderr": 0.019542101564854114 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.43636363636363634, "acc_stderr": 0.04750185058907297, "acc_norm": 0.43636363636363634, "acc_norm_stderr": 0.04750185058907297 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5020408163265306, "acc_stderr": 0.0320089533497105, "acc_norm": 0.5020408163265306, "acc_norm_stderr": 0.0320089533497105 }, "harness|hendrycksTest-sociology|5": { "acc": 0.4975124378109453, "acc_stderr": 0.03535490150137288, "acc_norm": 0.4975124378109453, "acc_norm_stderr": 0.03535490150137288 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.57, "acc_stderr": 0.04975698519562427, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562427 }, "harness|hendrycksTest-virology|5": { "acc": 0.40963855421686746, "acc_stderr": 0.038284011150790206, "acc_norm": 0.40963855421686746, "acc_norm_stderr": 0.038284011150790206 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5029239766081871, "acc_stderr": 0.03834759370936839, "acc_norm": 0.5029239766081871, "acc_norm_stderr": 0.03834759370936839 }, "harness|truthfulqa:mc|0": { "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237009, "mc2": 0.4200995329344425, "mc2_stderr": 0.01434315654117436 }, "harness|winogrande|5": { "acc": 0.648776637726914, "acc_stderr": 0.013415981370545135 }, "harness|gsm8k|5": { "acc": 0.043214556482183475, "acc_stderr": 0.005600987515237865 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_u-chom__ex-llm-e1
[ "region:us" ]
2023-12-09T14:53:39+00:00
{"pretty_name": "Evaluation run of u-chom/ex-llm-e1", "dataset_summary": "Dataset automatically created during the evaluation run of model [u-chom/ex-llm-e1](https://huggingface.co/u-chom/ex-llm-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_u-chom__ex-llm-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T14:50:53.053467](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__ex-llm-e1/blob/main/results_2023-12-09T14-50-53.053467.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39402023545271236,\n \"acc_stderr\": 0.03431633094943852,\n \"acc_norm\": 0.3992950895868925,\n \"acc_norm_stderr\": 0.03515648307530415,\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237009,\n \"mc2\": 0.4200995329344425,\n \"mc2_stderr\": 0.01434315654117436\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.35921501706484643,\n \"acc_stderr\": 0.014020224155839159,\n \"acc_norm\": 0.3993174061433447,\n \"acc_norm_stderr\": 0.014312094557946698\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5060744871539534,\n \"acc_stderr\": 0.004989413158034801,\n \"acc_norm\": 0.6811392152957578,\n \"acc_norm_stderr\": 0.004650825168905203\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.04060127035236397,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.04060127035236397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.03032594578928611,\n \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.03032594578928611\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.3680555555555556,\n \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117317,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117317\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745643,\n \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745643\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3709677419354839,\n \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.3709677419354839,\n \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.0389853160557942,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.0389853160557942\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.035402943770953675,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.035402943770953675\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.02345467488940429,\n 
\"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.02345467488940429\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188705,\n \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188705\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.44220183486238535,\n \"acc_stderr\": 0.02129361320752021,\n \"acc_norm\": 0.44220183486238535,\n \"acc_norm_stderr\": 0.02129361320752021\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015474,\n \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015474\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.510548523206751,\n \"acc_stderr\": 0.032539983791662855,\n \"acc_norm\": 0.510548523206751,\n \"acc_norm_stderr\": 0.032539983791662855\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.44274809160305345,\n \"acc_stderr\": 0.04356447202665069,\n \"acc_norm\": 0.44274809160305345,\n \"acc_norm_stderr\": 0.04356447202665069\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.38650306748466257,\n \"acc_stderr\": 0.038258255488486076,\n \"acc_norm\": 0.38650306748466257,\n \"acc_norm_stderr\": 0.038258255488486076\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.047211885060971716,\n \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.047211885060971716\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5085470085470085,\n \"acc_stderr\": 0.0327513030009703,\n \"acc_norm\": 0.5085470085470085,\n \"acc_norm_stderr\": 0.0327513030009703\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465918,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465918\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.508301404853129,\n \"acc_stderr\": 0.017877498991072,\n \"acc_norm\": 0.508301404853129,\n \"acc_norm_stderr\": 0.017877498991072\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.407514450867052,\n \"acc_stderr\": 0.026454578146931494,\n \"acc_norm\": 0.407514450867052,\n \"acc_norm_stderr\": 0.026454578146931494\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n \"acc_stderr\": 0.015218109544410179,\n \"acc_norm\": 0.2927374301675978,\n \"acc_norm_stderr\": 0.015218109544410179\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.028245134024387285,\n \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.028245134024387285\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.39228295819935693,\n \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.39228295819935693,\n \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.41358024691358025,\n \"acc_stderr\": 0.02740204204026994,\n \"acc_norm\": 0.41358024691358025,\n \"acc_norm_stderr\": 0.02740204204026994\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3262411347517731,\n \"acc_stderr\": 0.027968453043563168,\n \"acc_norm\": 0.3262411347517731,\n \"acc_norm_stderr\": 0.027968453043563168\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3305084745762712,\n \"acc_stderr\": 0.01201414210184297,\n \"acc_norm\": 0.3305084745762712,\n \"acc_norm_stderr\": 0.01201414210184297\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3709150326797386,\n \"acc_stderr\": 0.019542101564854114,\n \"acc_norm\": 0.3709150326797386,\n \"acc_norm_stderr\": 0.019542101564854114\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4975124378109453,\n \"acc_stderr\": 0.03535490150137288,\n \"acc_norm\": 0.4975124378109453,\n \"acc_norm_stderr\": 0.03535490150137288\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5029239766081871,\n \"acc_stderr\": 0.03834759370936839,\n \"acc_norm\": 0.5029239766081871,\n \"acc_norm_stderr\": 0.03834759370936839\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237009,\n \"mc2\": 0.4200995329344425,\n \"mc2_stderr\": 0.01434315654117436\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.648776637726914,\n \"acc_stderr\": 0.013415981370545135\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.043214556482183475,\n \"acc_stderr\": 0.005600987515237865\n }\n}\n```", "repo_url": "https://huggingface.co/u-chom/ex-llm-e1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|arc:challenge|25_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|gsm8k|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hellaswag|10_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["**/details_harness|winogrande|5_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T14-50-53.053467.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T14_50_53.053467", "path": ["results_2023-12-09T14-50-53.053467.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T14-50-53.053467.parquet"]}]}]}
2023-12-09T14:54:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of u-chom/ex-llm-e1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model u-chom/ex-llm-e1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T14:50:53.053467 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
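The loading snippet referenced above was stripped from this processed text, so the sketch below is a minimal reconstruction rather than the original code; the repository id open-llm-leaderboard/details_u-chom__ex-llm-e1 is an assumption based on the standard details_<org>__<model> naming used by other leaderboard records in this dump, and the config names are taken from the config list of this record.

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard/details_<org>__<model> convention.
repo_id = "open-llm-leaderboard/details_u-chom__ex-llm-e1"

# "harness_winogrande_5" is one of the 63 configs listed for this record;
# the "train" split always points at the most recent evaluation run.
details = load_dataset(repo_id, "harness_winogrande_5", split="train")

# The aggregated metrics live in the "results" config, which also exposes a "latest" split.
results = load_dataset(repo_id, "results", split="latest")
```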
[ "# Dataset Card for Evaluation run of u-chom/ex-llm-e1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model u-chom/ex-llm-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T14:50:53.053467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of u-chom/ex-llm-e1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model u-chom/ex-llm-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T14:50:53.053467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of u-chom/ex-llm-e1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model u-chom/ex-llm-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T14:50:53.053467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8cdbfc9e0ba206df55f57724039c1f979d22162b
### Getting Started The AgentSearch-V1 dataset boasts a comprehensive collection of over one billion embeddings, produced using [jina-v2-base](https://huggingface.co/jinaai/jina-embeddings-v2-base-en). The dataset encompasses more than 50 million high-quality documents and over 1 billion passages, covering a vast range of content from sources such as Arxiv, Wikipedia, Project Gutenberg, and including carefully filtered Creative Commons (CC) data. Our team is dedicated to continuously expanding and enhancing this corpus to improve the search experience. We welcome your thoughts and suggestions – please feel free to reach out with your ideas! To access and utilize the AgentSearch-V1 dataset, you can stream it via HuggingFace with the following Python code: ```python from datasets import load_dataset import json import numpy as np # To stream the entire dataset: ds = load_dataset("SciPhi/AgentSearch-V1", data_files="**/*", split="train", streaming=True) # Optional, stream just the "arxiv" dataset # ds = load_dataset("SciPhi/AgentSearch-V1", data_files="arxiv/*", split="train", streaming=True) # To process the entries: for entry in ds: embeddings = np.frombuffer( entry['embeddings'], dtype=np.float32 ).reshape(-1, 768) text_chunks = json.loads(entry['text_chunks']) metadata = json.loads(entry['metadata']) print(f'Embeddings:\n{embeddings}\n\nChunks:\n{text_chunks}\n\nMetadata:\n{metadata}') break ``` --- A full set of scripts to recreate the dataset from scratch can be found [here](https://github.com/SciPhi-AI/agent-search). Further, you may check the docs for details on how to perform RAG over AgentSearch. ### Languages English. ## Dataset Structure The raw dataset structure is as follows: ```json { "url": ..., "title": ..., "metadata": {"url": "...", "timestamp": "...", "source": "...", "language": "...", ...}, "text_chunks": ..., "embeddings": ..., "dataset": "book" | "arxiv" | "wikipedia" | "stack-exchange" | "open-math" | "RedPajama-Data-V2" } ``` ## Dataset Creation This dataset was created as a step towards making humanity's most important knowledge openly searchable and LLM optimal. It was created by filtering, cleaning, and augmenting locally publicly available datasets. To cite our work, please use the following: ``` @software{SciPhi2023AgentSearch, author = {SciPhi}, title = {AgentSearch [ΨΦ]: A Comprehensive Agent-First Framework and Dataset for Webscale Search}, year = {2023}, url = {https://github.com/SciPhi-AI/agent-search} } ``` ### Source Data ``` @ONLINE{wikidump, author = "Wikimedia Foundation", title = "Wikimedia Downloads", url = "https://dumps.wikimedia.org" } ``` ``` @misc{paster2023openwebmath, title={OpenWebMath: An Open Dataset of High-Quality Mathematical Web Text}, author={Keiran Paster and Marco Dos Santos and Zhangir Azerbayev and Jimmy Ba}, year={2023}, eprint={2310.06786}, archivePrefix={arXiv}, primaryClass={cs.AI} } ``` ``` @software{together2023redpajama, author = {Together Computer}, title = {RedPajama: An Open Source Recipe to Reproduce LLaMA training dataset}, month = April, year = 2023, url = {https://github.com/togethercomputer/RedPajama-Data} } ``` ### License Please refer to the licenses of the data subsets you use. 
* [Open-Web (Common Crawl Foundation Terms of Use)](https://commoncrawl.org/terms-of-use/full/) * Books: [the_pile_books3 license](https://huggingface.co/datasets/the_pile_books3#licensing-information) and [pg19 license](https://huggingface.co/datasets/pg19#licensing-information) * [ArXiv Terms of Use](https://info.arxiv.org/help/api/tou.html) * [Wikipedia License](https://huggingface.co/datasets/wikipedia#licensing-information) * [StackExchange license on the Internet Archive](https://archive.org/details/stackexchange) <!-- ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed] -->
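Building on the record structure shown in the Dataset Structure section above, the sketch below is a small, hedged example of restricting the stream to a single source subset and unpacking the stored embeddings. It assumes the "dataset" field arrives as a plain string and that "embeddings" is a raw float32 buffer of 768-dimensional vectors, as suggested by the structure and snippet above; these details are not verified against the released schema.

```python
import json

import numpy as np
from datasets import load_dataset

# Stream the corpus and keep only records whose "dataset" field marks them as Wikipedia.
ds = load_dataset("SciPhi/AgentSearch-V1", data_files="**/*", split="train", streaming=True)
wiki_only = ds.filter(lambda entry: entry["dataset"] == "wikipedia")

for entry in wiki_only:
    # One 768-dim float32 vector per text chunk, packed into a single byte buffer.
    vectors = np.frombuffer(entry["embeddings"], dtype=np.float32).reshape(-1, 768)
    chunks = json.loads(entry["text_chunks"])
    print(entry["title"], len(chunks), vectors.shape)
    break
```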
SciPhi/AgentSearch-V1
[ "task_categories:text-generation", "size_categories:1B<n<10B", "language:en", "arxiv:2310.06786", "region:us" ]
2023-12-09T15:05:54+00:00
{"language": ["en"], "size_categories": ["1B<n<10B"], "task_categories": ["text-generation"], "pretty_name": "AgentSearch-V1", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "**/*.parquet"}]}]}
2024-01-14T03:54:39+00:00
[ "2310.06786" ]
[ "en" ]
TAGS #task_categories-text-generation #size_categories-1B<n<10B #language-English #arxiv-2310.06786 #region-us
### Getting Started The AgentSearch-V1 dataset boasts a comprehensive collection of over one billion embeddings, produced using jina-v2-base. The dataset encompasses more than 50 million high-quality documents and over 1 billion passages, covering a vast range of content from sources such as Arxiv, Wikipedia, Project Gutenberg, and includes carefully filtered Creative Commons (CC) data. Our team is dedicated to continuously expanding and enhancing this corpus to improve the search experience. We welcome your thoughts and suggestions – please feel free to reach out with your ideas! To access and utilize the AgentSearch-V1 dataset, you can stream it via HuggingFace with the following Python code: --- A full set of scripts to recreate the dataset from scratch can be found here. Further, you may check the docs for details on how to perform RAG over AgentSearch. ### Languages English. ## Dataset Structure The raw dataset structure is as follows: ## Dataset Creation This dataset was created as a step towards making humanities most important knowledge openly searchable and LLM optimal. It was created by filtering, cleaning, and augmenting locally publicly available datasets. To cite our work, please use the following: ### Source Data ### License Please refer to the licenses of the data subsets you use. * Open-Web (Common Crawl Foundation Terms of Use) * Books: the_pile_books3 license and pg19 license * ArXiv Terms of Use * Wikipedia License * StackExchange license on the Internet Archive
[ "### Getting Started\n\nThe AgentSearch-V1 dataset boasts a comprehensive collection of over one billion embeddings, produced using jina-v2-base. The dataset encompasses more than 50 million high-quality documents and over 1 billion passages, covering a vast range of content from sources such as Arxiv, Wikipedia, Project Gutenberg, and includes carefully filtered Creative Commons (CC) data. Our team is dedicated to continuously expanding and enhancing this corpus to improve the search experience. We welcome your thoughts and suggestions – please feel free to reach out with your ideas!\n\nTo access and utilize the AgentSearch-V1 dataset, you can stream it via HuggingFace with the following Python code:\n\n\n\n---\n\nA full set of scripts to recreate the dataset from scratch can be found here. Further, you may check the docs for details on how to perform RAG over AgentSearch.", "### Languages\n\nEnglish.", "## Dataset Structure\n\nThe raw dataset structure is as follows:", "## Dataset Creation\n\nThis dataset was created as a step towards making humanities most important knowledge openly searchable and LLM optimal. It was created by filtering, cleaning, and augmenting locally publicly available datasets.\n\nTo cite our work, please use the following:", "### Source Data", "### License\nPlease refer to the licenses of the data subsets you use.\n\n* Open-Web (Common Crawl Foundation Terms of Use)\n* Books: the_pile_books3 license and pg19 license\n* ArXiv Terms of Use\n* Wikipedia License\n* StackExchange license on the Internet Archive" ]
[ "TAGS\n#task_categories-text-generation #size_categories-1B<n<10B #language-English #arxiv-2310.06786 #region-us \n", "### Getting Started\n\nThe AgentSearch-V1 dataset boasts a comprehensive collection of over one billion embeddings, produced using jina-v2-base. The dataset encompasses more than 50 million high-quality documents and over 1 billion passages, covering a vast range of content from sources such as Arxiv, Wikipedia, Project Gutenberg, and includes carefully filtered Creative Commons (CC) data. Our team is dedicated to continuously expanding and enhancing this corpus to improve the search experience. We welcome your thoughts and suggestions – please feel free to reach out with your ideas!\n\nTo access and utilize the AgentSearch-V1 dataset, you can stream it via HuggingFace with the following Python code:\n\n\n\n---\n\nA full set of scripts to recreate the dataset from scratch can be found here. Further, you may check the docs for details on how to perform RAG over AgentSearch.", "### Languages\n\nEnglish.", "## Dataset Structure\n\nThe raw dataset structure is as follows:", "## Dataset Creation\n\nThis dataset was created as a step towards making humanities most important knowledge openly searchable and LLM optimal. It was created by filtering, cleaning, and augmenting locally publicly available datasets.\n\nTo cite our work, please use the following:", "### Source Data", "### License\nPlease refer to the licenses of the data subsets you use.\n\n* Open-Web (Common Crawl Foundation Terms of Use)\n* Books: the_pile_books3 license and pg19 license\n* ArXiv Terms of Use\n* Wikipedia License\n* StackExchange license on the Internet Archive" ]
[ 42, 197, 6, 16, 61, 4, 68 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1B<n<10B #language-English #arxiv-2310.06786 #region-us \n### Getting Started\n\nThe AgentSearch-V1 dataset boasts a comprehensive collection of over one billion embeddings, produced using jina-v2-base. The dataset encompasses more than 50 million high-quality documents and over 1 billion passages, covering a vast range of content from sources such as Arxiv, Wikipedia, Project Gutenberg, and includes carefully filtered Creative Commons (CC) data. Our team is dedicated to continuously expanding and enhancing this corpus to improve the search experience. We welcome your thoughts and suggestions – please feel free to reach out with your ideas!\n\nTo access and utilize the AgentSearch-V1 dataset, you can stream it via HuggingFace with the following Python code:\n\n\n\n---\n\nA full set of scripts to recreate the dataset from scratch can be found here. Further, you may check the docs for details on how to perform RAG over AgentSearch.### Languages\n\nEnglish.## Dataset Structure\n\nThe raw dataset structure is as follows:## Dataset Creation\n\nThis dataset was created as a step towards making humanities most important knowledge openly searchable and LLM optimal. It was created by filtering, cleaning, and augmenting locally publicly available datasets.\n\nTo cite our work, please use the following:### Source Data### License\nPlease refer to the licenses of the data subsets you use.\n\n* Open-Web (Common Crawl Foundation Terms of Use)\n* Books: the_pile_books3 license and pg19 license\n* ArXiv Terms of Use\n* Wikipedia License\n* StackExchange license on the Internet Archive" ]
2ebe02eaef527289a4e2e50a19ff823019681b41
## Description The neverending music video channel ## Model SVD ## LoRA jbilcke-hf/sdxl-cinematic-2 ## Voice Cloée ## Prompt A video channel which produces dance music videos all day long!
jbilcke-hf/ai-tube-latentmusik
[ "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-09T15:08:24+00:00
{"license": "cc-by-nc-sa-4.0", "pretty_name": "Latentmusik"}
2023-12-12T22:54:44+00:00
[]
[]
TAGS #license-cc-by-nc-sa-4.0 #region-us
## Description The neverending music video channel ## Model SVD ## LoRA jbilcke-hf/sdxl-cinematic-2 ## Voice Cloée ## Prompt A video channel which produces dance music videos all day long!
[ "## Description\n\nThe neverending music video channel", "## Model\n\nSVD", "## LoRA\n\njbilcke-hf/sdxl-cinematic-2", "## Voice\n\nCloée", "## Prompt\n\nA video channel which produces dance music videos all day long!" ]
[ "TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n", "## Description\n\nThe neverending music video channel", "## Model\n\nSVD", "## LoRA\n\njbilcke-hf/sdxl-cinematic-2", "## Voice\n\nCloée", "## Prompt\n\nA video channel which produces dance music videos all day long!" ]
[ 19, 9, 4, 17, 4, 17 ]
[ "passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n## Description\n\nThe neverending music video channel## Model\n\nSVD## LoRA\n\njbilcke-hf/sdxl-cinematic-2## Voice\n\nCloée## Prompt\n\nA video channel which produces dance music videos all day long!" ]
0d0df966a4e6d80d69a4fc0f35048542d86a62ee
# Dataset Card for Evaluation run of xxyyy123/Mistral-dpo-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/xxyyy123/Mistral-dpo-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [xxyyy123/Mistral-dpo-v1](https://huggingface.co/xxyyy123/Mistral-dpo-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:21:55.337757](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1/blob/main/results_2023-12-09T15-21-55.337757.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6327450857688259, "acc_stderr": 0.03239847501378947, "acc_norm": 0.6369776561937077, "acc_norm_stderr": 0.03304786152616907, "mc1": 0.3525091799265606, "mc1_stderr": 0.016724646380756547, "mc2": 0.50494275538215, "mc2_stderr": 0.015065297117078024 }, "harness|arc:challenge|25": { "acc": 0.5972696245733788, "acc_stderr": 0.014332236306790147, "acc_norm": 0.6348122866894198, "acc_norm_stderr": 0.014070265519268802 }, "harness|hellaswag|10": { "acc": 0.6350328619796853, "acc_stderr": 0.00480437056385622, "acc_norm": 0.8358892650866361, "acc_norm_stderr": 0.0036961908325474184 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595852, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595852 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.03823428969926604, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.03823428969926604 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.037455547914624555, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383886, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383886 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4827586206896552, "acc_stderr": 0.04164188720169377, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.04164188720169377 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3915343915343915, "acc_stderr": 0.025138091388851112, "acc_norm": 0.3915343915343915, "acc_norm_stderr": 0.025138091388851112 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7548387096774194, "acc_stderr": 0.024472243840895518, "acc_norm": 0.7548387096774194, "acc_norm_stderr": 0.024472243840895518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.02912652283458682, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.02912652283458682 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.02381447708659355, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.02381447708659355 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524575, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524575 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 
0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8256880733944955, "acc_stderr": 0.016265675632010354, "acc_norm": 0.8256880733944955, "acc_norm_stderr": 0.016265675632010354 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.03407632093854051 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474086, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474086 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.02655837250266192, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.02655837250266192 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776679, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776679 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.04058042015646034, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.04058042015646034 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597518, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597518 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.708092485549133, "acc_stderr": 0.02447699407624734, "acc_norm": 0.708092485549133, "acc_norm_stderr": 0.02447699407624734 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3307262569832402, "acc_stderr": 0.01573502625896612, "acc_norm": 0.3307262569832402, "acc_norm_stderr": 0.01573502625896612 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218893, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218893 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7253086419753086, "acc_stderr": 0.024836057868294677, "acc_norm": 0.7253086419753086, "acc_norm_stderr": 0.024836057868294677 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.45390070921985815, "acc_stderr": 0.02970045324729147, "acc_norm": 0.45390070921985815, "acc_norm_stderr": 0.02970045324729147 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4302477183833116, "acc_stderr": 0.012645361435115233, "acc_norm": 0.4302477183833116, "acc_norm_stderr": 0.012645361435115233 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.02873932851398357, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.02873932851398357 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.018901015322093092, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.018901015322093092 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291293, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291293 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.3525091799265606, "mc1_stderr": 0.016724646380756547, "mc2": 0.50494275538215, "mc2_stderr": 0.015065297117078024 }, "harness|winogrande|5": { "acc": 0.7932123125493291, "acc_stderr": 0.011382566829235802 }, "harness|gsm8k|5": { "acc": 0.4609552691432904, "acc_stderr": 0.013730428449116337 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1
[ "region:us" ]
2023-12-09T15:24:47+00:00
{"pretty_name": "Evaluation run of xxyyy123/Mistral-dpo-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/Mistral-dpo-v1](https://huggingface.co/xxyyy123/Mistral-dpo-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:21:55.337757](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1/blob/main/results_2023-12-09T15-21-55.337757.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6327450857688259,\n \"acc_stderr\": 0.03239847501378947,\n \"acc_norm\": 0.6369776561937077,\n \"acc_norm_stderr\": 0.03304786152616907,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.50494275538215,\n \"mc2_stderr\": 0.015065297117078024\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790147,\n \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6350328619796853,\n \"acc_stderr\": 0.00480437056385622,\n \"acc_norm\": 0.8358892650866361,\n \"acc_norm_stderr\": 0.0036961908325474184\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n 
\"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383886,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383886\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n 
\"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010354,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010354\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597518,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597518\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n 
\"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729147,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729147\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n \"acc_stderr\": 0.012645361435115233,\n \"acc_norm\": 0.4302477183833116,\n \"acc_norm_stderr\": 0.012645361435115233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.50494275538215,\n \"mc2_stderr\": 0.015065297117078024\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235802\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4609552691432904,\n \"acc_stderr\": 0.013730428449116337\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/Mistral-dpo-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["**/details_harness|winogrande|5_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-21-55.337757.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T15_21_55.337757", "path": ["results_2023-12-09T15-21-55.337757.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T15-21-55.337757.parquet"]}]}]}
2023-12-09T15:25:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xxyyy123/Mistral-dpo-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xxyyy123/Mistral-dpo-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:21:55.337757(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of xxyyy123/Mistral-dpo-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/Mistral-dpo-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:21:55.337757(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xxyyy123/Mistral-dpo-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/Mistral-dpo-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:21:55.337757(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/Mistral-dpo-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/Mistral-dpo-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:21:55.337757(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6d9ddf341da8005afca10a3b3c726807e140fc2a
# Dataset Card for Evaluation run of perlthoughts/Falkor-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/perlthoughts/Falkor-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [perlthoughts/Falkor-7b](https://huggingface.co/perlthoughts/Falkor-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Falkor-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-09T15:28:39.767223](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-7b/blob/main/results_2023-12-09T15-28-39.767223.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.6427907257764981, "acc_stderr": 0.032414118407172746, "acc_norm": 0.6442720295040886, "acc_norm_stderr": 0.03307021882829144, "mc1": 0.46878824969400246, "mc1_stderr": 0.017469364874577547, "mc2": 0.6307542042080609, "mc2_stderr": 0.015278403284571293 }, "harness|arc:challenge|25": { "acc": 0.6544368600682594, "acc_stderr": 0.013896938461145682, "acc_norm": 0.6825938566552902, "acc_norm_stderr": 0.013602239088038169 }, "harness|hellaswag|10": { "acc": 0.6762597092212707, "acc_stderr": 0.004669459891917688, "acc_norm": 0.8583947420832504, "acc_norm_stderr": 0.003479322860225654 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.042320736951515885, "acc_norm": 0.6, "acc_norm_stderr": 0.042320736951515885 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416907, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416907 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.04461960433384739, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108102, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108102 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.025225450284067884, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.025225450284067884 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.02315787934908353, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.02315787934908353 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768776, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768776 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971125, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971125 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37777777777777777, "acc_stderr": 0.029560707392465725, "acc_norm": 0.37777777777777777, "acc_norm_stderr": 0.029560707392465725 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7184873949579832, "acc_stderr": 0.029213549414372177, "acc_norm": 
0.7184873949579832, "acc_norm_stderr": 0.029213549414372177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.41721854304635764, "acc_stderr": 0.040261414976346104, "acc_norm": 0.41721854304635764, "acc_norm_stderr": 0.040261414976346104 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.015555802713590172, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.015555802713590172 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5648148148148148, "acc_stderr": 0.033812000056435254, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.033812000056435254 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601436, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601436 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.034624199316156234, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.034624199316156234 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507332, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507332 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.013547415658662257, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.013547415658662257 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7138728323699421, "acc_stderr": 0.02433214677913413, "acc_norm": 0.7138728323699421, "acc_norm_stderr": 0.02433214677913413 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39888268156424583, "acc_stderr": 0.01637696614261008, "acc_norm": 0.39888268156424583, "acc_norm_stderr": 0.01637696614261008 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818733, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818733 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.025494259350694912, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.025494259350694912 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.025006469755799215, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799215 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.029736592526424438, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.029736592526424438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45827900912646674, "acc_stderr": 0.01272570165695364, "acc_norm": 0.45827900912646674, "acc_norm_stderr": 0.01272570165695364 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6507352941176471, "acc_stderr": 0.028959755196824873, "acc_norm": 0.6507352941176471, "acc_norm_stderr": 0.028959755196824873 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6470588235294118, "acc_stderr": 0.019333142020797164, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.019333142020797164 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.46878824969400246, "mc1_stderr": 0.017469364874577547, "mc2": 0.6307542042080609, "mc2_stderr": 0.015278403284571293 }, "harness|winogrande|5": { "acc": 0.8034727703235991, "acc_stderr": 0.011168120593569563 }, "harness|gsm8k|5": { "acc": 0.6050037907505686, "acc_stderr": 0.013465354969973205 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_perlthoughts__Falkor-7b
[ "region:us" ]
2023-12-09T15:31:31+00:00
{"pretty_name": "Evaluation run of perlthoughts/Falkor-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [perlthoughts/Falkor-7b](https://huggingface.co/perlthoughts/Falkor-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Falkor-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:28:39.767223](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-7b/blob/main/results_2023-12-09T15-28-39.767223.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6427907257764981,\n \"acc_stderr\": 0.032414118407172746,\n \"acc_norm\": 0.6442720295040886,\n \"acc_norm_stderr\": 0.03307021882829144,\n \"mc1\": 0.46878824969400246,\n \"mc1_stderr\": 0.017469364874577547,\n \"mc2\": 0.6307542042080609,\n \"mc2_stderr\": 0.015278403284571293\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145682,\n \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6762597092212707,\n \"acc_stderr\": 0.004669459891917688,\n \"acc_norm\": 0.8583947420832504,\n \"acc_norm_stderr\": 0.003479322860225654\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n 
\"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465725,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465725\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372177,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 
0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.01637696614261008,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.01637696614261008\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799215,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799215\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797164,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797164\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46878824969400246,\n \"mc1_stderr\": 0.017469364874577547,\n \"mc2\": 0.6307542042080609,\n \"mc2_stderr\": 0.015278403284571293\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569563\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6050037907505686,\n \"acc_stderr\": 0.013465354969973205\n }\n}\n```", "repo_url": "https://huggingface.co/perlthoughts/Falkor-7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-28-39.767223.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-28-39.767223.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-28-39.767223.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-28-39.767223.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-28-39.767223.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-28-39.767223.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["**/details_harness|winogrande|5_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-28-39.767223.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T15_28_39.767223", "path": ["results_2023-12-09T15-28-39.767223.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T15-28-39.767223.parquet"]}]}]}
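The file listing above enumerates every per-task configuration and its parquet splits. A minimal sketch of discovering those configurations programmatically and loading one of them — the config name and the `latest` split are taken from the listing itself, while everything else (public access, default cache location) is an assumption:

```python
from datasets import get_dataset_config_names, load_dataset

repo_id = "open-llm-leaderboard/details_perlthoughts__Falkor-7b"

# Enumerate the available configurations (the card reports 63 of them).
configs = get_dataset_config_names(repo_id)
print(len(configs), configs[:5])

# Load the per-sample details for one MMLU subtask; "latest" points at the
# most recent evaluation run recorded in the listing above.
details = load_dataset(
    repo_id,
    "harness_hendrycksTest_college_computer_science_5",
    split="latest",
)
print(details)
```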
2023-12-09T15:32:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of perlthoughts/Falkor-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model perlthoughts/Falkor-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:28:39.767223 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
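The flattened card text above says "To load the details from a run, you can for instance do the following:" but the code block itself was stripped during flattening. Reconstructed from this record's own metadata (the repo id, config name and split appear there verbatim), the referenced snippet is presumably:

```python
from datasets import load_dataset

# Config name and split copied from the dataset_summary in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Falkor-7b",
    "harness_winogrande_5",
    split="train",
)
```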
[ "# Dataset Card for Evaluation run of perlthoughts/Falkor-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Falkor-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:28:39.767223(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of perlthoughts/Falkor-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Falkor-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:28:39.767223(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of perlthoughts/Falkor-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Falkor-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:28:39.767223(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
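The passage above repeats the card's loading instructions without the accompanying code. Beyond the per-task detail configs, the listing for this record also exposes an aggregated `results` configuration with a `latest` split; a short sketch of inspecting it (column names are not documented in the card, so they are only probed here, not assumed):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of perlthoughts/Falkor-7b.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Falkor-7b",
    "results",
    split="latest",
)

print(results)               # row/column summary
print(results.column_names)  # probe the available metric fields
```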
3336e4bb59f785bc4bd7282a48460968da8b0b0b
## Description AI-generated lofi beats to work or study 🎧 Work, Play, Rest and Repeat. ## Model SVD ## LoRA ProomptEngineer/pe-lofi-hiphop-lofi-girl-concept ## Voice Muted ## Music lofi hiphop beat loop ## Prompt A channel generating lofi-beat music videos.
jbilcke-hf/ai-tube-lofi-chilled-llama
[ "license:cc-by-nc-4.0", "region:us" ]
2023-12-09T15:33:57+00:00
{"license": "cc-by-nc-4.0", "pretty_name": "ChilledLlama"}
2023-12-14T10:13:42+00:00
[]
[]
TAGS #license-cc-by-nc-4.0 #region-us
## Description AI-generated lofi beats to work or study Work, Play, Rest and Repeat. ## Model SVD ## LoRA ProomptEngineer/pe-lofi-hiphop-lofi-girl-concept ## Voice Muted ## Music lofi hiphop beat loop ## Prompt A channel generating lofi-beat music videos.
[ "## Description\n\nAI-generated lofi beats to work or study \n\nWork, Play, Rest and Repeat.", "## Model\n\nSVD", "## LoRA\n\nProomptEngineer/pe-lofi-hiphop-lofi-girl-concept", "## Voice\n\nMuted", "## Music\n\nlofi hiphop beat loop", "## Prompt\n\nA channel generating lofi-beat music videos." ]
[ "TAGS\n#license-cc-by-nc-4.0 #region-us \n", "## Description\n\nAI-generated lofi beats to work or study \n\nWork, Play, Rest and Repeat.", "## Model\n\nSVD", "## LoRA\n\nProomptEngineer/pe-lofi-hiphop-lofi-girl-concept", "## Voice\n\nMuted", "## Music\n\nlofi hiphop beat loop", "## Prompt\n\nA channel generating lofi-beat music videos." ]
[ 17, 24, 4, 25, 4, 8, 15 ]
[ "passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n## Description\n\nAI-generated lofi beats to work or study \n\nWork, Play, Rest and Repeat.## Model\n\nSVD## LoRA\n\nProomptEngineer/pe-lofi-hiphop-lofi-girl-concept## Voice\n\nMuted## Music\n\nlofi hiphop beat loop## Prompt\n\nA channel generating lofi-beat music videos." ]
3f5858c7e25a716ffae8a5ef89684f0d651263aa
# Dataset Card for Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [zyh3826/llama2-13b-ft-openllm-leaderboard-v1](https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:33:42.644192](https://huggingface.co/datasets/open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1/blob/main/results_2023-12-09T15-33-42.644192.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6016495918139398, "acc_stderr": 0.03270798736533002, "acc_norm": 0.612894192678486, "acc_norm_stderr": 0.033541474205616734, "mc1": 0.28151774785801714, "mc1_stderr": 0.01574402724825605, "mc2": 0.40723683293857477, "mc2_stderr": 0.01336809717170015 }, "harness|arc:challenge|25": { "acc": 0.552901023890785, "acc_stderr": 0.014529380160526842, "acc_norm": 0.5964163822525598, "acc_norm_stderr": 0.01433715891426844 }, "harness|hellaswag|10": { "acc": 0.6276638119896435, "acc_stderr": 0.004824393076826628, "acc_norm": 0.8314080860386377, "acc_norm_stderr": 0.0037362592995204874 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5259259259259259, "acc_stderr": 0.04313531696750575, "acc_norm": 0.5259259259259259, "acc_norm_stderr": 0.04313531696750575 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.631578947368421, "acc_stderr": 0.03925523381052932, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.03925523381052932 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6377358490566037, "acc_stderr": 0.029582245128384303, "acc_norm": 0.6377358490566037, "acc_norm_stderr": 0.029582245128384303 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6944444444444444, "acc_stderr": 0.03852084696008534, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.03852084696008534 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.03724249595817731, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.03724249595817731 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105653, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105653 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4765957446808511, "acc_stderr": 0.03265019475033582, "acc_norm": 0.4765957446808511, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.044346007015849245, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.044346007015849245 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3386243386243386, "acc_stderr": 0.024373197867983056, "acc_norm": 0.3386243386243386, "acc_norm_stderr": 0.024373197867983056 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.043758884927270605, 
"acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.043758884927270605 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7193548387096774, "acc_stderr": 0.02556060472102288, "acc_norm": 0.7193548387096774, "acc_norm_stderr": 0.02556060472102288 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03477691162163659, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03477691162163659 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218957, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218957 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5948717948717949, "acc_stderr": 0.024890471769938145, "acc_norm": 0.5948717948717949, "acc_norm_stderr": 0.024890471769938145 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.02874204090394849, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.02874204090394849 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6092436974789915, "acc_stderr": 0.03169380235712996, "acc_norm": 0.6092436974789915, "acc_norm_stderr": 0.03169380235712996 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7944954128440367, "acc_stderr": 0.017324352325016015, "acc_norm": 0.7944954128440367, "acc_norm_stderr": 0.017324352325016015 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49074074074074076, "acc_stderr": 0.034093869469927006, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162111, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162111 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.03181149747055359, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.03181149747055359 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7251908396946565, "acc_stderr": 0.03915345408847836, "acc_norm": 0.7251908396946565, "acc_norm_stderr": 0.03915345408847836 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.03826076324884866, "acc_norm": 0.8055555555555556, 
"acc_norm_stderr": 0.03826076324884866 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.0457237235873743, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.0457237235873743 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8135376756066411, "acc_stderr": 0.013927751372001505, "acc_norm": 0.8135376756066411, "acc_norm_stderr": 0.013927751372001505 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6647398843930635, "acc_stderr": 0.025416003773165538, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.025416003773165538 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41675977653631285, "acc_stderr": 0.016489134962438954, "acc_norm": 0.41675977653631285, "acc_norm_stderr": 0.016489134962438954 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.673202614379085, "acc_stderr": 0.026857294663281413, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.026857294663281413 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6591639871382636, "acc_stderr": 0.026920841260776165, "acc_norm": 0.6591639871382636, "acc_norm_stderr": 0.026920841260776165 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4589308996088657, "acc_stderr": 0.012727084826799798, "acc_norm": 0.4589308996088657, "acc_norm_stderr": 0.012727084826799798 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6029411764705882, "acc_stderr": 0.029722152099280065, "acc_norm": 0.6029411764705882, "acc_norm_stderr": 0.029722152099280065 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6062091503267973, "acc_stderr": 0.019766211991073063, "acc_norm": 0.6062091503267973, "acc_norm_stderr": 0.019766211991073063 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6775510204081633, "acc_stderr": 0.02992310056368391, "acc_norm": 0.6775510204081633, "acc_norm_stderr": 0.02992310056368391 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.027962677604768907, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.027962677604768907 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896308, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896308 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, 
"acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.031581495393387324, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.031581495393387324 }, "harness|truthfulqa:mc|0": { "mc1": 0.28151774785801714, "mc1_stderr": 0.01574402724825605, "mc2": 0.40723683293857477, "mc2_stderr": 0.01336809717170015 }, "harness|winogrande|5": { "acc": 0.7734806629834254, "acc_stderr": 0.011764149054698329 }, "harness|gsm8k|5": { "acc": 0.013646702047005308, "acc_stderr": 0.0031957470754808027 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1
[ "region:us" ]
2023-12-09T15:35:57+00:00
{"pretty_name": "Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [zyh3826/llama2-13b-ft-openllm-leaderboard-v1](https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:33:42.644192](https://huggingface.co/datasets/open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1/blob/main/results_2023-12-09T15-33-42.644192.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6016495918139398,\n \"acc_stderr\": 0.03270798736533002,\n \"acc_norm\": 0.612894192678486,\n \"acc_norm_stderr\": 0.033541474205616734,\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.40723683293857477,\n \"mc2_stderr\": 0.01336809717170015\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526842,\n \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.01433715891426844\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6276638119896435,\n \"acc_stderr\": 0.004824393076826628,\n \"acc_norm\": 0.8314080860386377,\n \"acc_norm_stderr\": 0.0037362592995204874\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983056,\n \"acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983056\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.02556060472102288,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.02556060472102288\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218957,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218957\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n 
\"acc_norm_stderr\": 0.025787723180723875\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016015,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016015\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162111,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162111\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165538,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165538\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799798,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799798\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073063,\n \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073063\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768907,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768907\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.40723683293857477,\n \"mc2_stderr\": 0.01336809717170015\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698329\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.013646702047005308,\n \"acc_stderr\": 0.0031957470754808027\n }\n}\n```", "repo_url": "https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["**/details_harness|winogrande|5_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-09T15-33-42.644192.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T15_33_42.644192", "path": ["results_2023-12-09T15-33-42.644192.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T15-33-42.644192.parquet"]}]}]}
2023-12-09T15:36:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model zyh3826/llama2-13b-ft-openllm-leaderboard-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:33:42.644192 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
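The "To load the details from a run" sentence above lost its accompanying code block in this cleaned-text rendering. A minimal sketch of what it refers to, following the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the sibling cards in this dump: the exact repo id is an assumption, the `harness_winogrande_5` config name comes from this record's metadata, and `split="train"` mirrors the card's own statement that the "train" split points at the latest results.

```python
from datasets import load_dataset

# Repo id inferred from the card title (assumption); the config name appears in
# this record's metadata block, and the card states the "train" split always
# points at the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1",
    "harness_winogrande_5",
    split="train",
)
```

Each config in the metadata block above also exposes a timestamped split and a "latest" split, so a specific run can be loaded by passing that split name instead.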
[ "# Dataset Card for Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zyh3826/llama2-13b-ft-openllm-leaderboard-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:33:42.644192(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model zyh3826/llama2-13b-ft-openllm-leaderboard-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:33:42.644192(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 32, 31, 181, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model zyh3826/llama2-13b-ft-openllm-leaderboard-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:33:42.644192(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4d58691447a538a62d2c656920aeb46be8461ef5
# Sampled Trelis/big_patent_sample Dataset This is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions between 60,000 and 250,000 characters in length.
Trelis/big_patent_60k_to_250k_characters
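A minimal sketch of how the stated length bound could be spot-checked after loading. The repo id is the record's id above; the split name and the `description` column (mirroring the parent big_patent schema) are assumptions.

```python
from datasets import load_dataset

# Repo id comes from the record above; the split name and the "description"
# column are assumptions based on the parent big_patent dataset's schema.
ds = load_dataset("Trelis/big_patent_60k_to_250k_characters", split="train")

# Spot-check the stated length bound on a small sample of rows.
sample = ds.select(range(min(100, len(ds))))
lengths = [len(row["description"]) for row in sample]
print(min(lengths), max(lengths))  # expected to lie within 60,000-250,000
```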
[ "region:us" ]
2023-12-09T15:37:09+00:00
{}
2023-12-09T15:38:19+00:00
[]
[]
TAGS #region-us
# Sampled Trelis/big_patent_sample Dataset This is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions between 60,000 and 250,000 characters in length.
[ "# Sampled Trelis/big_patent_sample Dataset\nThis is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions between 60,000 to 250,000 characters in length." ]
[ "TAGS\n#region-us \n", "# Sampled Trelis/big_patent_sample Dataset\nThis is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions between 60,000 to 250,000 characters in length." ]
[ 6, 53 ]
[ "passage: TAGS\n#region-us \n# Sampled Trelis/big_patent_sample Dataset\nThis is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions between 60,000 to 250,000 characters in length." ]
f691a130d17d88b78a97a82664bb0ec696dcab88
# Dataset Card for Evaluation run of haoranxu/ALMA-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/haoranxu/ALMA-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [haoranxu/ALMA-13B](https://huggingface.co/haoranxu/ALMA-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:43:59.183673](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B/blob/main/results_2023-12-09T15-43-59.183673.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4975926896834348, "acc_stderr": 0.0340184872709999, "acc_norm": 0.5055471649834049, "acc_norm_stderr": 0.034951829731267196, "mc1": 0.25091799265605874, "mc1_stderr": 0.015176985027707693, "mc2": 0.3756948163452805, "mc2_stderr": 0.013646478427855928 }, "harness|arc:challenge|25": { "acc": 0.537542662116041, "acc_stderr": 0.014570144495075581, "acc_norm": 0.568259385665529, "acc_norm_stderr": 0.014474591427196204 }, "harness|hellaswag|10": { "acc": 0.5962955586536547, "acc_stderr": 0.0048963681857652356, "acc_norm": 0.8029277036446923, "acc_norm_stderr": 0.0039697442326224195 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.04461960433384739, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4934210526315789, "acc_stderr": 0.040685900502249704, "acc_norm": 0.4934210526315789, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5358490566037736, "acc_stderr": 0.030693675018458003, "acc_norm": 0.5358490566037736, "acc_norm_stderr": 0.030693675018458003 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666665, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4393063583815029, "acc_stderr": 0.037842719328874674, "acc_norm": 0.4393063583815029, "acc_norm_stderr": 0.037842719328874674 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.28431372549019607, "acc_stderr": 0.04488482852329017, "acc_norm": 0.28431372549019607, "acc_norm_stderr": 0.04488482852329017 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4085106382978723, "acc_stderr": 0.03213418026701576, "acc_norm": 0.4085106382978723, "acc_norm_stderr": 0.03213418026701576 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4413793103448276, "acc_stderr": 0.04137931034482758, "acc_norm": 0.4413793103448276, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.34656084656084657, "acc_stderr": 0.024508777521028424, "acc_norm": 0.34656084656084657, "acc_norm_stderr": 0.024508777521028424 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2619047619047619, "acc_stderr": 0.03932537680392871, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.03932537680392871 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5709677419354838, "acc_stderr": 0.028156036538233193, "acc_norm": 0.5709677419354838, "acc_norm_stderr": 0.028156036538233193 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4187192118226601, "acc_stderr": 0.034711928605184676, "acc_norm": 0.4187192118226601, "acc_norm_stderr": 0.034711928605184676 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6606060606060606, "acc_stderr": 0.03697442205031595, "acc_norm": 0.6606060606060606, "acc_norm_stderr": 0.03697442205031595 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6161616161616161, "acc_stderr": 0.03464881675016339, "acc_norm": 0.6161616161616161, "acc_norm_stderr": 0.03464881675016339 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.694300518134715, "acc_stderr": 0.033248379397581594, "acc_norm": 0.694300518134715, "acc_norm_stderr": 0.033248379397581594 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.47692307692307695, "acc_stderr": 0.025323990861736125, "acc_norm": 0.47692307692307695, "acc_norm_stderr": 0.025323990861736125 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5294117647058824, "acc_stderr": 0.03242225027115007, "acc_norm": 
0.5294117647058824, "acc_norm_stderr": 0.03242225027115007 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.25165562913907286, "acc_stderr": 0.035433042343899844, "acc_norm": 0.25165562913907286, "acc_norm_stderr": 0.035433042343899844 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6403669724770642, "acc_stderr": 0.02057523466012378, "acc_norm": 0.6403669724770642, "acc_norm_stderr": 0.02057523466012378 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3425925925925926, "acc_stderr": 0.03236585252602158, "acc_norm": 0.3425925925925926, "acc_norm_stderr": 0.03236585252602158 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6225490196078431, "acc_stderr": 0.03402272044340703, "acc_norm": 0.6225490196078431, "acc_norm_stderr": 0.03402272044340703 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6624472573839663, "acc_stderr": 0.03078154910202622, "acc_norm": 0.6624472573839663, "acc_norm_stderr": 0.03078154910202622 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5725190839694656, "acc_stderr": 0.043389203057924, "acc_norm": 0.5725190839694656, "acc_norm_stderr": 0.043389203057924 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6776859504132231, "acc_stderr": 0.042664163633521685, "acc_norm": 0.6776859504132231, "acc_norm_stderr": 0.042664163633521685 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04668408033024931, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04668408033024931 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5644171779141104, "acc_stderr": 0.038956324641389366, "acc_norm": 0.5644171779141104, "acc_norm_stderr": 0.038956324641389366 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.04432804055291519, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.04432804055291519 }, "harness|hendrycksTest-management|5": { "acc": 0.5728155339805825, "acc_stderr": 0.04897957737781169, "acc_norm": 0.5728155339805825, "acc_norm_stderr": 0.04897957737781169 }, "harness|hendrycksTest-marketing|5": { "acc": 0.688034188034188, "acc_stderr": 0.03035152732334493, "acc_norm": 0.688034188034188, "acc_norm_stderr": 0.03035152732334493 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6819923371647509, "acc_stderr": 0.016653486275615376, "acc_norm": 0.6819923371647509, "acc_norm_stderr": 0.016653486275615376 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5867052023121387, "acc_stderr": 0.026511261369409247, "acc_norm": 0.5867052023121387, "acc_norm_stderr": 0.026511261369409247 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2860335195530726, "acc_stderr": 0.015113972129062125, "acc_norm": 0.2860335195530726, "acc_norm_stderr": 0.015113972129062125 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5620915032679739, "acc_stderr": 0.028408302020332687, "acc_norm": 0.5620915032679739, "acc_norm_stderr": 0.028408302020332687 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5980707395498392, "acc_stderr": 0.027846476005930473, "acc_norm": 0.5980707395498392, "acc_norm_stderr": 0.027846476005930473 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.5709876543209876, "acc_stderr": 0.027538925613470863, "acc_norm": 0.5709876543209876, "acc_norm_stderr": 0.027538925613470863 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40070921985815605, "acc_stderr": 0.029233465745573083, "acc_norm": 0.40070921985815605, "acc_norm_stderr": 0.029233465745573083 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.36897001303780963, "acc_stderr": 0.012323936650174862, "acc_norm": 0.36897001303780963, "acc_norm_stderr": 0.012323936650174862 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.44485294117647056, "acc_stderr": 0.030187532060329387, "acc_norm": 0.44485294117647056, "acc_norm_stderr": 0.030187532060329387 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.49836601307189543, "acc_stderr": 0.020227726838150124, "acc_norm": 0.49836601307189543, "acc_norm_stderr": 0.020227726838150124 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, "acc_stderr": 0.04782001791380061, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.04782001791380061 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5142857142857142, "acc_stderr": 0.03199615232806286, "acc_norm": 0.5142857142857142, "acc_norm_stderr": 0.03199615232806286 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7114427860696517, "acc_stderr": 0.03203841040213321, "acc_norm": 0.7114427860696517, "acc_norm_stderr": 0.03203841040213321 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-virology|5": { "acc": 0.4397590361445783, "acc_stderr": 0.03864139923699122, "acc_norm": 0.4397590361445783, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.034462962170884265, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.25091799265605874, "mc1_stderr": 0.015176985027707693, "mc2": 0.3756948163452805, "mc2_stderr": 0.013646478427855928 }, "harness|winogrande|5": { "acc": 0.7632202052091555, "acc_stderr": 0.011947592365207392 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_haoranxu__ALMA-13B
[ "region:us" ]
2023-12-09T15:46:54+00:00
{"pretty_name": "Evaluation run of haoranxu/ALMA-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [haoranxu/ALMA-13B](https://huggingface.co/haoranxu/ALMA-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haoranxu__ALMA-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:43:59.183673](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B/blob/main/results_2023-12-09T15-43-59.183673.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4975926896834348,\n \"acc_stderr\": 0.0340184872709999,\n \"acc_norm\": 0.5055471649834049,\n \"acc_norm_stderr\": 0.034951829731267196,\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707693,\n \"mc2\": 0.3756948163452805,\n \"mc2_stderr\": 0.013646478427855928\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.537542662116041,\n \"acc_stderr\": 0.014570144495075581,\n \"acc_norm\": 0.568259385665529,\n \"acc_norm_stderr\": 0.014474591427196204\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5962955586536547,\n \"acc_stderr\": 0.0048963681857652356,\n \"acc_norm\": 0.8029277036446923,\n \"acc_norm_stderr\": 0.0039697442326224195\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016339,\n \"acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016339\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736125,\n 
\"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115007,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6403669724770642,\n \"acc_stderr\": 0.02057523466012378,\n \"acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.02057523466012378\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602158,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602158\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6624472573839663,\n \"acc_stderr\": 0.03078154910202622,\n \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.03078154910202622\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.043389203057924,\n \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.043389203057924\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.038956324641389366,\n \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.038956324641389366\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781169,\n \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781169\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n \"acc_stderr\": 0.03035152732334493,\n \"acc_norm\": 0.688034188034188,\n \"acc_norm_stderr\": 0.03035152732334493\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6819923371647509,\n \"acc_stderr\": 0.016653486275615376,\n \"acc_norm\": 0.6819923371647509,\n \"acc_norm_stderr\": 0.016653486275615376\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.026511261369409247,\n \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.026511261369409247\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2860335195530726,\n \"acc_stderr\": 0.015113972129062125,\n \"acc_norm\": 0.2860335195530726,\n \"acc_norm_stderr\": 0.015113972129062125\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.028408302020332687,\n \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.028408302020332687\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36897001303780963,\n \"acc_stderr\": 0.012323936650174862,\n \"acc_norm\": 0.36897001303780963,\n \"acc_norm_stderr\": 0.012323936650174862\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329387,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329387\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150124,\n \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150124\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.03199615232806286,\n \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.03199615232806286\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707693,\n \"mc2\": 0.3756948163452805,\n \"mc2_stderr\": 0.013646478427855928\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207392\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/haoranxu/ALMA-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-43-59.183673.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-43-59.183673.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-43-59.183673.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-43-59.183673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-43-59.183673.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["**/details_harness|winogrande|5_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-43-59.183673.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T15_43_59.183673", "path": ["results_2023-12-09T15-43-59.183673.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T15-43-59.183673.parquet"]}]}]}
2023-12-09T15:47:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of haoranxu/ALMA-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model haoranxu/ALMA-13B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:43:59.183673 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
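The flattened card above says "you can for instance do the following" but the accompanying snippet was dropped in this rendering. A minimal sketch of that load call follows; the repository id is an assumption based on the usual open-llm-leaderboard "details_<org>__<model>" naming, and "harness_winogrande_5" is simply one of the per-task configurations listed in the metadata above.

```python
from datasets import load_dataset

# Assumed repo id (follows the details_<org>__<model> convention) and an
# example per-task configuration; "train" points at the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_haoranxu__ALMA-13B",
    "harness_winogrande_5",
    split="train",
)
```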
[ "# Dataset Card for Evaluation run of haoranxu/ALMA-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model haoranxu/ALMA-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:43:59.183673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of haoranxu/ALMA-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model haoranxu/ALMA-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:43:59.183673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 16, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of haoranxu/ALMA-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model haoranxu/ALMA-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:43:59.183673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bef26c9b00a63d7bfca7dbd20f5435517e2da19c
# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [openaccess-ai-collective/DPOpenHermes-7B-v2](https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:48:02.975332](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2/blob/main/results_2023-12-09T15-48-02.975332.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6393858131029347, "acc_stderr": 0.03231519248140217, "acc_norm": 0.6405744963876552, "acc_norm_stderr": 0.032967768680137746, "mc1": 0.412484700122399, "mc1_stderr": 0.01723329939957122, "mc2": 0.5922184046952629, "mc2_stderr": 0.015444038493597899 }, "harness|arc:challenge|25": { "acc": 0.6348122866894198, "acc_stderr": 0.014070265519268802, "acc_norm": 0.6663822525597269, "acc_norm_stderr": 0.013778687054176536 }, "harness|hellaswag|10": { "acc": 0.664708225453097, "acc_stderr": 0.004711275408138421, "acc_norm": 0.8522206731726748, "acc_norm_stderr": 0.0035415582637791008 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.028254200344438665, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.028254200344438665 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6242774566473989, "acc_stderr": 0.036928207672648664, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.036928207672648664 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108102, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108102 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305526, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305526 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, 
"acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7612903225806451, "acc_stderr": 0.02425107126220884, "acc_norm": 0.7612903225806451, "acc_norm_stderr": 0.02425107126220884 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721175, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721175 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790492, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790492 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.024233532297758733, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.024233532297758733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6384615384615384, "acc_stderr": 0.024359581465396997, "acc_norm": 0.6384615384615384, "acc_norm_stderr": 0.024359581465396997 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.029953823891887048, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887048 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8275229357798165, "acc_stderr": 0.01619780795684804, "acc_norm": 0.8275229357798165, "acc_norm_stderr": 0.01619780795684804 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5509259259259259, "acc_stderr": 0.03392238405321617, "acc_norm": 0.5509259259259259, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.028125972265654373, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.028125972265654373 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.025955020841621115, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.025955020841621115 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094632, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094632 }, 
"harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.022509033937077805, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077805 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8148148148148148, "acc_stderr": 0.013890862162876166, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.013890862162876166 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3642458100558659, "acc_stderr": 0.016094338768474596, "acc_norm": 0.3642458100558659, "acc_norm_stderr": 0.016094338768474596 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.02505850331695814, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.02505850331695814 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885142, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885142 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7407407407407407, "acc_stderr": 0.02438366553103545, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.02438366553103545 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.012743072942653345, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.012743072942653345 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7058823529411765, "acc_stderr": 0.027678468642144724, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.027678468642144724 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6879084967320261, "acc_stderr": 0.018745011201277657, "acc_norm": 0.6879084967320261, "acc_norm_stderr": 0.018745011201277657 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786845, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786845 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, 
"acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.412484700122399, "mc1_stderr": 0.01723329939957122, "mc2": 0.5922184046952629, "mc2_stderr": 0.015444038493597899 }, "harness|winogrande|5": { "acc": 0.7916337805840569, "acc_stderr": 0.011414554399987727 }, "harness|gsm8k|5": { "acc": 0.6360879454131918, "acc_stderr": 0.013252539227966195 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2
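The card above loads one per-task configuration; the aggregated scores live in the "results" configuration, and each configuration also exposes a "latest" split alongside the timestamped one. A minimal sketch of pulling those aggregates is below; the config and split names are taken from the configuration list in this record's metadata, not verified against the live repository.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent DPOpenHermes-7B-v2 run:
# "results" is the aggregate configuration described in the card, and
# "latest" is the split that always points at the newest evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2",
    "results",
    split="latest",
)
```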
[ "region:us" ]
2023-12-09T15:50:52+00:00
{"pretty_name": "Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/DPOpenHermes-7B-v2](https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:48:02.975332](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2/blob/main/results_2023-12-09T15-48-02.975332.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6393858131029347,\n \"acc_stderr\": 0.03231519248140217,\n \"acc_norm\": 0.6405744963876552,\n \"acc_norm_stderr\": 0.032967768680137746,\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5922184046952629,\n \"mc2_stderr\": 0.015444038493597899\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268802,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176536\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.664708225453097,\n \"acc_stderr\": 0.004711275408138421,\n \"acc_norm\": 0.8522206731726748,\n \"acc_norm_stderr\": 0.0035415582637791008\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305526,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305526\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790492,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790492\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684804,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684804\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876166,\n 
\"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n \"acc_stderr\": 0.016094338768474596,\n \"acc_norm\": 0.3642458100558659,\n \"acc_norm_stderr\": 0.016094338768474596\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653345,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653345\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144724,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144724\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5922184046952629,\n \"mc2_stderr\": 0.015444038493597899\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987727\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6360879454131918,\n \"acc_stderr\": 0.013252539227966195\n }\n}\n```", "repo_url": 
"https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_48_02.975332", "path": ["**/details_harness|winogrande|5_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-48-02.975332.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T15_48_02.975332", "path": ["results_2023-12-09T15-48-02.975332.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T15-48-02.975332.parquet"]}]}]}
2023-12-09T15:51:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-7B-v2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:48:02.975332 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:48:02.975332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:48:02.975332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:48:02.975332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
271857ec673a1f6e83fc94053cfa2d9a821ee387
# Dataset Card for Evaluation run of uukuguy/speechless-coding-7b-16k-tora ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/speechless-coding-7b-16k-tora - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/speechless-coding-7b-16k-tora](https://huggingface.co/uukuguy/speechless-coding-7b-16k-tora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-coding-7b-16k-tora", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:50:40.789199](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coding-7b-16k-tora/blob/main/results_2023-12-09T15-50-40.789199.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3931109615254218, "acc_stderr": 0.03416544865753528, "acc_norm": 0.3960835606892354, "acc_norm_stderr": 0.03491838760794626, "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.4490702414317695, "mc2_stderr": 0.01493086789491207 }, "harness|arc:challenge|25": { "acc": 0.37457337883959047, "acc_stderr": 0.014144193471893446, "acc_norm": 0.4121160409556314, "acc_norm_stderr": 0.0143839153022254 }, "harness|hellaswag|10": { "acc": 0.4838677554272057, "acc_stderr": 0.004987183560792758, "acc_norm": 0.6444931288587931, "acc_norm_stderr": 0.004776883632722618 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.35555555555555557, "acc_stderr": 0.04135176749720386, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.04135176749720386 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.32894736842105265, "acc_stderr": 0.03823428969926604, "acc_norm": 0.32894736842105265, "acc_norm_stderr": 0.03823428969926604 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.39622641509433965, "acc_stderr": 0.030102793781791194, "acc_norm": 0.39622641509433965, "acc_norm_stderr": 0.030102793781791194 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3472222222222222, "acc_stderr": 0.039812405437178615, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.27, "acc_stderr":
0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.04461960433384739, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3352601156069364, "acc_stderr": 0.03599586301247077, "acc_norm": 0.3352601156069364, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.043364327079931785, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.043364327079931785 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.59, "acc_stderr": 0.04943110704237101, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237101 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2936170212765957, "acc_stderr": 0.029771642712491227, "acc_norm": 0.2936170212765957, "acc_norm_stderr": 0.029771642712491227 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3448275862068966, "acc_stderr": 0.03960933549451208, "acc_norm": 0.3448275862068966, "acc_norm_stderr": 0.03960933549451208 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2751322751322751, "acc_stderr": 0.023000086859068656, "acc_norm": 0.2751322751322751, "acc_norm_stderr": 0.023000086859068656 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2619047619047619, "acc_stderr": 0.03932537680392871, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.03932537680392871 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3903225806451613, "acc_stderr": 0.027751256636969576, "acc_norm": 0.3903225806451613, "acc_norm_stderr": 0.027751256636969576 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3497536945812808, "acc_stderr": 0.03355400904969566, "acc_norm": 0.3497536945812808, "acc_norm_stderr": 0.03355400904969566 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.509090909090909, "acc_stderr": 0.03903698647748441, "acc_norm": 0.509090909090909, "acc_norm_stderr": 0.03903698647748441 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.47474747474747475, "acc_stderr": 0.03557806245087314, "acc_norm": 0.47474747474747475, "acc_norm_stderr": 0.03557806245087314 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.46113989637305697, "acc_stderr": 0.03597524411734578, "acc_norm": 0.46113989637305697, "acc_norm_stderr": 0.03597524411734578 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3717948717948718, "acc_stderr": 0.024503472557110946, "acc_norm": 0.3717948717948718, "acc_norm_stderr": 0.024503472557110946 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.027309140588230186, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.027309140588230186 }, "harness|hendrycksTest-high_school_microeconomics|5": 
{ "acc": 0.3949579831932773, "acc_stderr": 0.03175367846096624, "acc_norm": 0.3949579831932773, "acc_norm_stderr": 0.03175367846096624 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.23178807947019867, "acc_stderr": 0.03445406271987054, "acc_norm": 0.23178807947019867, "acc_norm_stderr": 0.03445406271987054 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.42935779816513764, "acc_stderr": 0.021222286397236508, "acc_norm": 0.42935779816513764, "acc_norm_stderr": 0.021222286397236508 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.27314814814814814, "acc_stderr": 0.03038805130167812, "acc_norm": 0.27314814814814814, "acc_norm_stderr": 0.03038805130167812 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4411764705882353, "acc_stderr": 0.03484941514429231, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.03484941514429231 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5569620253164557, "acc_stderr": 0.032335327775334835, "acc_norm": 0.5569620253164557, "acc_norm_stderr": 0.032335327775334835 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4304932735426009, "acc_stderr": 0.033231973029429394, "acc_norm": 0.4304932735426009, "acc_norm_stderr": 0.033231973029429394 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.4732824427480916, "acc_stderr": 0.04379024936553894, "acc_norm": 0.4732824427480916, "acc_norm_stderr": 0.04379024936553894 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5537190082644629, "acc_stderr": 0.04537935177947879, "acc_norm": 0.5537190082644629, "acc_norm_stderr": 0.04537935177947879 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.49074074074074076, "acc_stderr": 0.04832853553437055, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.04832853553437055 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.39263803680981596, "acc_stderr": 0.03836740907831029, "acc_norm": 0.39263803680981596, "acc_norm_stderr": 0.03836740907831029 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.043270409325787296, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.043270409325787296 }, "harness|hendrycksTest-management|5": { "acc": 0.5048543689320388, "acc_stderr": 0.049505043821289195, "acc_norm": 0.5048543689320388, "acc_norm_stderr": 0.049505043821289195 }, "harness|hendrycksTest-marketing|5": { "acc": 0.594017094017094, "acc_stderr": 0.03217180182641087, "acc_norm": 0.594017094017094, "acc_norm_stderr": 0.03217180182641087 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4840357598978289, "acc_stderr": 0.01787084750608173, "acc_norm": 0.4840357598978289, "acc_norm_stderr": 0.01787084750608173 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.4653179190751445, "acc_stderr": 0.02685425792825889, "acc_norm": 0.4653179190751445, "acc_norm_stderr": 0.02685425792825889 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25139664804469275, "acc_stderr": 0.014508979453553984, "acc_norm": 0.25139664804469275, "acc_norm_stderr": 0.014508979453553984 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3954248366013072, "acc_stderr": 0.027996723180631455, "acc_norm": 0.3954248366013072, "acc_norm_stderr": 0.027996723180631455 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.44694533762057875, "acc_stderr": 0.028237769422085324, "acc_norm": 0.44694533762057875, 
"acc_norm_stderr": 0.028237769422085324 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.42592592592592593, "acc_stderr": 0.027513747284379428, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.027513747284379428 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3191489361702128, "acc_stderr": 0.027807990141320193, "acc_norm": 0.3191489361702128, "acc_norm_stderr": 0.027807990141320193 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3122555410691004, "acc_stderr": 0.011835798135683182, "acc_norm": 0.3122555410691004, "acc_norm_stderr": 0.011835798135683182 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3125, "acc_stderr": 0.02815637344037142, "acc_norm": 0.3125, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3627450980392157, "acc_stderr": 0.019450768432505514, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.019450768432505514 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5363636363636364, "acc_stderr": 0.04776449162396197, "acc_norm": 0.5363636363636364, "acc_norm_stderr": 0.04776449162396197 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.46938775510204084, "acc_stderr": 0.031949171367580624, "acc_norm": 0.46938775510204084, "acc_norm_stderr": 0.031949171367580624 }, "harness|hendrycksTest-sociology|5": { "acc": 0.47761194029850745, "acc_stderr": 0.035319879302087305, "acc_norm": 0.47761194029850745, "acc_norm_stderr": 0.035319879302087305 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-virology|5": { "acc": 0.3795180722891566, "acc_stderr": 0.037777988227480165, "acc_norm": 0.3795180722891566, "acc_norm_stderr": 0.037777988227480165 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.391812865497076, "acc_stderr": 0.03743979825926401, "acc_norm": 0.391812865497076, "acc_norm_stderr": 0.03743979825926401 }, "harness|truthfulqa:mc|0": { "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.4490702414317695, "mc2_stderr": 0.01493086789491207 }, "harness|winogrande|5": { "acc": 0.6361483820047356, "acc_stderr": 0.013521488896883415 }, "harness|gsm8k|5": { "acc": 0.1728582259287339, "acc_stderr": 0.010415432246200566 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_uukuguy__speechless-coding-7b-16k-tora
[ "region:us" ]
2023-12-09T15:53:37+00:00
{"pretty_name": "Evaluation run of uukuguy/speechless-coding-7b-16k-tora", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-coding-7b-16k-tora](https://huggingface.co/uukuguy/speechless-coding-7b-16k-tora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-coding-7b-16k-tora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:50:40.789199](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coding-7b-16k-tora/blob/main/results_2023-12-09T15-50-40.789199.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3931109615254218,\n \"acc_stderr\": 0.03416544865753528,\n \"acc_norm\": 0.3960835606892354,\n \"acc_norm_stderr\": 0.03491838760794626,\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4490702414317695,\n \"mc2_stderr\": 0.01493086789491207\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.37457337883959047,\n \"acc_stderr\": 0.014144193471893446,\n \"acc_norm\": 0.4121160409556314,\n \"acc_norm_stderr\": 0.0143839153022254\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4838677554272057,\n \"acc_stderr\": 0.004987183560792758,\n \"acc_norm\": 0.6444931288587931,\n \"acc_norm_stderr\": 0.004776883632722618\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791194,\n \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791194\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n 
\"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068656,\n \"acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068656\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3903225806451613,\n \"acc_stderr\": 0.027751256636969576,\n \"acc_norm\": 0.3903225806451613,\n \"acc_norm_stderr\": 0.027751256636969576\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.03903698647748441,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.03903698647748441\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.47474747474747475,\n \"acc_stderr\": 0.03557806245087314,\n \"acc_norm\": 0.47474747474747475,\n \"acc_norm_stderr\": 0.03557806245087314\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.46113989637305697,\n \"acc_stderr\": 0.03597524411734578,\n \"acc_norm\": 0.46113989637305697,\n \"acc_norm_stderr\": 0.03597524411734578\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.024503472557110946,\n \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.024503472557110946\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.03175367846096624,\n \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.03175367846096624\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.42935779816513764,\n \"acc_stderr\": 0.021222286397236508,\n \"acc_norm\": 0.42935779816513764,\n \"acc_norm_stderr\": 0.021222286397236508\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03484941514429231,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03484941514429231\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5569620253164557,\n \"acc_stderr\": 0.032335327775334835,\n \"acc_norm\": 0.5569620253164557,\n \"acc_norm_stderr\": 0.032335327775334835\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.04537935177947879,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.04537935177947879\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n \"acc_stderr\": 0.03217180182641087,\n \"acc_norm\": 0.594017094017094,\n \"acc_norm_stderr\": 0.03217180182641087\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.4840357598978289,\n \"acc_stderr\": 0.01787084750608173,\n \"acc_norm\": 0.4840357598978289,\n \"acc_norm_stderr\": 0.01787084750608173\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4653179190751445,\n \"acc_stderr\": 0.02685425792825889,\n \"acc_norm\": 0.4653179190751445,\n \"acc_norm_stderr\": 0.02685425792825889\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n \"acc_stderr\": 0.014508979453553984,\n \"acc_norm\": 0.25139664804469275,\n \"acc_norm_stderr\": 0.014508979453553984\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.027996723180631455,\n \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.027996723180631455\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.44694533762057875,\n \"acc_stderr\": 0.028237769422085324,\n \"acc_norm\": 0.44694533762057875,\n \"acc_norm_stderr\": 0.028237769422085324\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.027513747284379428,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.027513747284379428\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320193,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320193\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3122555410691004,\n \"acc_stderr\": 0.011835798135683182,\n \"acc_norm\": 0.3122555410691004,\n \"acc_norm_stderr\": 0.011835798135683182\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.019450768432505514,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.019450768432505514\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.46938775510204084,\n \"acc_stderr\": 0.031949171367580624,\n \"acc_norm\": 0.46938775510204084,\n \"acc_norm_stderr\": 0.031949171367580624\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.47761194029850745,\n \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.47761194029850745,\n \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n \"acc_stderr\": 0.037777988227480165,\n \"acc_norm\": 0.3795180722891566,\n \"acc_norm_stderr\": 0.037777988227480165\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.391812865497076,\n \"acc_stderr\": 0.03743979825926401,\n \"acc_norm\": 0.391812865497076,\n \"acc_norm_stderr\": 0.03743979825926401\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4490702414317695,\n \"mc2_stderr\": 0.01493086789491207\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6361483820047356,\n \"acc_stderr\": 0.013521488896883415\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1728582259287339,\n \"acc_stderr\": 0.010415432246200566\n 
}\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-coding-7b-16k-tora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-50-40.789199.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-50-40.789199.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-50-40.789199.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-50-40.789199.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-50-40.789199.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_50_40.789199", "path": ["**/details_harness|winogrande|5_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-50-40.789199.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T15_50_40.789199", "path": ["results_2023-12-09T15-50-40.789199.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T15-50-40.789199.parquet"]}]}]}
2023-12-09T15:54:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of uukuguy/speechless-coding-7b-16k-tora ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model uukuguy/speechless-coding-7b-16k-tora on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch below): ## Latest results These are the latest results from run 2023-12-09T15:50:40.789199 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
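The summary in this row refers to a loading snippet that the flattened text field does not reproduce. Below is a minimal sketch of that call, assuming the details repository follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other leaderboard cards in this dump; the exact repo id is an inference, not a value stored in this row.

```python
from datasets import load_dataset

# Minimal sketch: load one evaluation configuration from the details repository
# for this run. The repo id below is inferred from the naming pattern of the
# other leaderboard detail cards in this dump (assumption, not stored in this row).
data = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-coding-7b-16k-tora",
    "harness_winogrande_5",
    split="train",
)
```

Any of the per-task configurations listed in the configs above (for example the harness_hendrycksTest_* entries) can be loaded the same way by swapping the config name.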
[ "# Dataset Card for Evaluation run of uukuguy/speechless-coding-7b-16k-tora", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coding-7b-16k-tora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:50:40.789199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of uukuguy/speechless-coding-7b-16k-tora", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coding-7b-16k-tora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:50:40.789199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 175, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-coding-7b-16k-tora## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coding-7b-16k-tora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:50:40.789199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e78efd58ab8ab99db8ab6be9f8a6dd4fa2b94c58
# Dataset Card for Evaluation run of v1olet/v1olet_mistral_7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/v1olet/v1olet_mistral_7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [v1olet/v1olet_mistral_7B](https://huggingface.co/v1olet/v1olet_mistral_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_mistral_7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:51:44.000216](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_mistral_7B/blob/main/results_2023-12-09T15-51-44.000216.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26155738386066046, "acc_stderr": 0.03127934667718085, "acc_norm": 0.26322679499270923, "acc_norm_stderr": 0.032116347963256846, "mc1": 0.2386780905752754, "mc1_stderr": 0.014922629695456421, "mc2": NaN, "mc2_stderr": NaN }, "harness|arc:challenge|25": { "acc": 0.24146757679180889, "acc_stderr": 0.01250656483973943, "acc_norm": 0.29180887372013653, "acc_norm_stderr": 0.013284525292403506 }, "harness|hellaswag|10": { "acc": 0.2621987651862179, "acc_stderr": 0.00438931274801215, "acc_norm": 0.2813184624576778, "acc_norm_stderr": 0.004487235657955673 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.25925925925925924, "acc_stderr": 0.037857144650666544, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.037857144650666544 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2631578947368421, "acc_stderr": 0.035834961763610645, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.035834961763610645 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2641509433962264, "acc_stderr": 0.027134291628741716, "acc_norm": 0.2641509433962264, "acc_norm_stderr": 0.027134291628741716 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.24305555555555555, "acc_stderr": 0.03586879280080341, "acc_norm": 0.24305555555555555, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.23, "acc_stderr": 0.042295258468165065, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24277456647398843, "acc_stderr": 0.0326926380614177, "acc_norm": 0.24277456647398843, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171453, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171453 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.28085106382978725, "acc_stderr": 0.029379170464124818, "acc_norm": 0.28085106382978725, "acc_norm_stderr": 0.029379170464124818 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24338624338624337, "acc_stderr": 0.022101128787415415, "acc_norm": 0.24338624338624337, "acc_norm_stderr": 0.022101128787415415 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2619047619047619, "acc_stderr": 0.0393253768039287, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.0393253768039287 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25161290322580643, "acc_stderr": 0.024685979286239956, "acc_norm": 0.25161290322580643, "acc_norm_stderr": 0.024685979286239956 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.1921182266009852, "acc_stderr": 0.027719315709614775, "acc_norm": 0.1921182266009852, "acc_norm_stderr": 0.027719315709614775 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2909090909090909, "acc_stderr": 0.03546563019624336, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.03546563019624336 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.2727272727272727, "acc_stderr": 0.03173071239071724, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.2694300518134715, "acc_stderr": 0.032018671228777947, "acc_norm": 0.2694300518134715, "acc_norm_stderr": 0.032018671228777947 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23076923076923078, "acc_stderr": 0.02136202772522272, "acc_norm": 0.23076923076923078, "acc_norm_stderr": 0.02136202772522272 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.23949579831932774, "acc_stderr": 0.02772206549336127, 
"acc_norm": 0.23949579831932774, "acc_norm_stderr": 0.02772206549336127 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2251655629139073, "acc_stderr": 0.03410435282008936, "acc_norm": 0.2251655629139073, "acc_norm_stderr": 0.03410435282008936 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.27706422018348625, "acc_stderr": 0.019188482590169538, "acc_norm": 0.27706422018348625, "acc_norm_stderr": 0.019188482590169538 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.32407407407407407, "acc_stderr": 0.03191923445686185, "acc_norm": 0.32407407407407407, "acc_norm_stderr": 0.03191923445686185 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03308611113236434, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03308611113236434 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2320675105485232, "acc_stderr": 0.027479744550808514, "acc_norm": 0.2320675105485232, "acc_norm_stderr": 0.027479744550808514 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.18834080717488788, "acc_stderr": 0.026241132996407266, "acc_norm": 0.18834080717488788, "acc_norm_stderr": 0.026241132996407266 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2824427480916031, "acc_stderr": 0.03948406125768361, "acc_norm": 0.2824427480916031, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.3140495867768595, "acc_stderr": 0.04236964753041017, "acc_norm": 0.3140495867768595, "acc_norm_stderr": 0.04236964753041017 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.04236511258094631, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.04236511258094631 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26380368098159507, "acc_stderr": 0.03462419931615623, "acc_norm": 0.26380368098159507, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.040073418097558045, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.040073418097558045 }, "harness|hendrycksTest-management|5": { "acc": 0.32038834951456313, "acc_stderr": 0.046202840822800406, "acc_norm": 0.32038834951456313, "acc_norm_stderr": 0.046202840822800406 }, "harness|hendrycksTest-marketing|5": { "acc": 0.24786324786324787, "acc_stderr": 0.0282863240755644, "acc_norm": 0.24786324786324787, "acc_norm_stderr": 0.0282863240755644 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.24010217113665389, "acc_stderr": 0.015274685213734188, "acc_norm": 0.24010217113665389, "acc_norm_stderr": 0.015274685213734188 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.28901734104046245, "acc_stderr": 0.024405173935783238, "acc_norm": 0.28901734104046245, "acc_norm_stderr": 0.024405173935783238 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2558659217877095, "acc_stderr": 0.014593620923210732, "acc_norm": 0.2558659217877095, "acc_norm_stderr": 0.014593620923210732 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2679738562091503, "acc_stderr": 0.025360603796242553, "acc_norm": 0.2679738562091503, "acc_norm_stderr": 0.025360603796242553 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.21221864951768488, "acc_stderr": 0.02322275679743512, "acc_norm": 0.21221864951768488, "acc_norm_stderr": 0.02322275679743512 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.25925925925925924, "acc_stderr": 0.024383665531035464, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.024383665531035464 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.30851063829787234, "acc_stderr": 0.027553366165101362, "acc_norm": 0.30851063829787234, "acc_norm_stderr": 0.027553366165101362 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2516297262059974, "acc_stderr": 0.011083276280441904, "acc_norm": 0.2516297262059974, "acc_norm_stderr": 0.011083276280441904 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.28308823529411764, "acc_stderr": 0.02736586113151381, "acc_norm": 0.28308823529411764, "acc_norm_stderr": 0.02736586113151381 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2761437908496732, "acc_stderr": 0.018087276935663133, "acc_norm": 0.2761437908496732, "acc_norm_stderr": 0.018087276935663133 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.24545454545454545, "acc_stderr": 0.041220665028782855, "acc_norm": 0.24545454545454545, "acc_norm_stderr": 0.041220665028782855 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.27755102040816326, "acc_stderr": 0.02866685779027465, "acc_norm": 0.27755102040816326, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.22885572139303484, "acc_stderr": 0.029705284056772436, "acc_norm": 0.22885572139303484, "acc_norm_stderr": 0.029705284056772436 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-virology|5": { "acc": 0.27710843373493976, "acc_stderr": 0.03484331592680588, "acc_norm": 0.27710843373493976, "acc_norm_stderr": 0.03484331592680588 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2573099415204678, "acc_stderr": 0.03352799844161865, "acc_norm": 0.2573099415204678, "acc_norm_stderr": 0.03352799844161865 }, "harness|truthfulqa:mc|0": { "mc1": 0.2386780905752754, "mc1_stderr": 0.014922629695456421, "mc2": NaN, "mc2_stderr": NaN }, "harness|winogrande|5": { "acc": 0.4940805051302289, "acc_stderr": 0.014051500838485807 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_v1olet__v1olet_mistral_7B
[ "region:us" ]
2023-12-09T15:54:36+00:00
{"pretty_name": "Evaluation run of v1olet/v1olet_mistral_7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [v1olet/v1olet_mistral_7B](https://huggingface.co/v1olet/v1olet_mistral_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v1olet__v1olet_mistral_7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:51:44.000216](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_mistral_7B/blob/main/results_2023-12-09T15-51-44.000216.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26155738386066046,\n \"acc_stderr\": 0.03127934667718085,\n \"acc_norm\": 0.26322679499270923,\n \"acc_norm_stderr\": 0.032116347963256846,\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456421,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.24146757679180889,\n \"acc_stderr\": 0.01250656483973943,\n \"acc_norm\": 0.29180887372013653,\n \"acc_norm_stderr\": 0.013284525292403506\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2621987651862179,\n \"acc_stderr\": 0.00438931274801215,\n \"acc_norm\": 0.2813184624576778,\n \"acc_norm_stderr\": 0.004487235657955673\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.037857144650666544,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.037857144650666544\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610645,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610645\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741716,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741716\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 
0.04408440022768079\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.029379170464124818,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.029379170464124818\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415415,\n \"acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415415\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.027719315709614775,\n \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.027719315709614775\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.032018671228777947,\n \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.032018671228777947\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23076923076923078,\n \"acc_stderr\": 
0.02136202772522272,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.02136202772522272\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.02772206549336127,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.02772206549336127\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27706422018348625,\n \"acc_stderr\": 0.019188482590169538,\n \"acc_norm\": 0.27706422018348625,\n \"acc_norm_stderr\": 0.019188482590169538\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03308611113236434,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03308611113236434\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2320675105485232,\n \"acc_stderr\": 0.027479744550808514,\n \"acc_norm\": 0.2320675105485232,\n \"acc_norm_stderr\": 0.027479744550808514\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n \"acc_stderr\": 0.026241132996407266,\n \"acc_norm\": 0.18834080717488788,\n \"acc_norm_stderr\": 0.026241132996407266\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041017,\n \"acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041017\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.04236511258094631,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.04236511258094631\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.040073418097558045,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.040073418097558045\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.046202840822800406,\n \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.046202840822800406\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.24786324786324787,\n \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24010217113665389,\n \"acc_stderr\": 0.015274685213734188,\n \"acc_norm\": 
0.24010217113665389,\n \"acc_norm_stderr\": 0.015274685213734188\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.024405173935783238,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.024405173935783238\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n \"acc_stderr\": 0.014593620923210732,\n \"acc_norm\": 0.2558659217877095,\n \"acc_norm_stderr\": 0.014593620923210732\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242553,\n \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242553\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n \"acc_stderr\": 0.02322275679743512,\n \"acc_norm\": 0.21221864951768488,\n \"acc_norm_stderr\": 0.02322275679743512\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.024383665531035464,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.024383665531035464\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101362,\n \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101362\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n \"acc_stderr\": 0.011083276280441904,\n \"acc_norm\": 0.2516297262059974,\n \"acc_norm_stderr\": 0.011083276280441904\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.28308823529411764,\n \"acc_stderr\": 0.02736586113151381,\n \"acc_norm\": 0.28308823529411764,\n \"acc_norm_stderr\": 0.02736586113151381\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663133,\n \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663133\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n \"acc_stderr\": 0.03484331592680588,\n \"acc_norm\": 0.27710843373493976,\n \"acc_norm_stderr\": 0.03484331592680588\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456421,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4940805051302289,\n \"acc_stderr\": 0.014051500838485807\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/v1olet/v1olet_mistral_7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-51-44.000216.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-51-44.000216.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-51-44.000216.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-51-44.000216.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-51-44.000216.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-51-44.000216.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["**/details_harness|winogrande|5_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-51-44.000216.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T15_51_44.000216", "path": ["results_2023-12-09T15-51-44.000216.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T15-51-44.000216.parquet"]}]}]}
2023-12-09T15:55:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of v1olet/v1olet_mistral_7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model v1olet/v1olet_mistral_7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:51:44.000216 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of v1olet/v1olet_mistral_7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_mistral_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:51:44.000216(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of v1olet/v1olet_mistral_7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_mistral_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:51:44.000216(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of v1olet/v1olet_mistral_7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_mistral_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:51:44.000216(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1cf0213189a7b4f71d0b36333a2c65f7255492eb
# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b-16k ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/hfl/chinese-alpaca-2-13b-16k - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [hfl/chinese-alpaca-2-13b-16k](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:53:33.265685](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k/blob/main/results_2023-12-09T15-53-33.265685.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5126179344828111, "acc_stderr": 0.0342051274120513, "acc_norm": 0.5178843368987507, "acc_norm_stderr": 0.034949756392914415, "mc1": 0.33047735618115054, "mc1_stderr": 0.016466769613698307, "mc2": 0.46496694797516, "mc2_stderr": 0.015236674932834036 }, "harness|arc:challenge|25": { "acc": 0.5213310580204779, "acc_stderr": 0.014598087973127106, "acc_norm": 0.5503412969283277, "acc_norm_stderr": 0.014537144444284738 }, "harness|hellaswag|10": { "acc": 0.5728938458474407, "acc_stderr": 0.004936470085238487, "acc_norm": 0.7741485759808803, "acc_norm_stderr": 0.0041728722829842005 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4666666666666667, "acc_stderr": 0.043097329010363554, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5131578947368421, "acc_stderr": 0.04067533136309173, "acc_norm": 0.5131578947368421, "acc_norm_stderr": 0.04067533136309173 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5509433962264151, "acc_stderr": 0.030612730713641095, "acc_norm": 0.5509433962264151, "acc_norm_stderr": 0.030612730713641095 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5069444444444444, "acc_stderr": 0.04180806750294938, "acc_norm": 0.5069444444444444, "acc_norm_stderr": 0.04180806750294938 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr":
0.04902071300001974 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5086705202312138, "acc_stderr": 0.038118909889404126, "acc_norm": 0.5086705202312138, "acc_norm_stderr": 0.038118909889404126 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929775, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929775 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3446808510638298, "acc_stderr": 0.03106898596312215, "acc_norm": 0.3446808510638298, "acc_norm_stderr": 0.03106898596312215 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.04166567577101579, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.04166567577101579 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30952380952380953, "acc_stderr": 0.023809523809523853, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.023809523809523853 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.31746031746031744, "acc_stderr": 0.041634530313028585, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.041634530313028585 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5741935483870968, "acc_stderr": 0.028129112709165904, "acc_norm": 0.5741935483870968, "acc_norm_stderr": 0.028129112709165904 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39901477832512317, "acc_stderr": 0.03445487686264715, "acc_norm": 0.39901477832512317, "acc_norm_stderr": 0.03445487686264715 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6606060606060606, "acc_stderr": 0.03697442205031596, "acc_norm": 0.6606060606060606, "acc_norm_stderr": 0.03697442205031596 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6666666666666666, "acc_stderr": 0.033586181457325226, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.033586181457325226 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7357512953367875, "acc_stderr": 0.03182155050916646, "acc_norm": 0.7357512953367875, "acc_norm_stderr": 0.03182155050916646 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.44358974358974357, "acc_stderr": 0.0251891498947642, "acc_norm": 0.44358974358974357, "acc_norm_stderr": 0.0251891498947642 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.02813325257881563, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.02813325257881563 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5210084033613446, "acc_stderr": 
0.03244980849990029, "acc_norm": 0.5210084033613446, "acc_norm_stderr": 0.03244980849990029 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7045871559633028, "acc_stderr": 0.019560619182976, "acc_norm": 0.7045871559633028, "acc_norm_stderr": 0.019560619182976 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39814814814814814, "acc_stderr": 0.033384734032074016, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.033384734032074016 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7009803921568627, "acc_stderr": 0.03213325717373617, "acc_norm": 0.7009803921568627, "acc_norm_stderr": 0.03213325717373617 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.70042194092827, "acc_stderr": 0.02981802474975309, "acc_norm": 0.70042194092827, "acc_norm_stderr": 0.02981802474975309 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5648854961832062, "acc_stderr": 0.04348208051644858, "acc_norm": 0.5648854961832062, "acc_norm_stderr": 0.04348208051644858 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.04026187527591207, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.04026187527591207 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5766871165644172, "acc_stderr": 0.03881891213334383, "acc_norm": 0.5766871165644172, "acc_norm_stderr": 0.03881891213334383 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.04327040932578729, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.04327040932578729 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280042, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280042 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7863247863247863, "acc_stderr": 0.02685345037700914, "acc_norm": 0.7863247863247863, "acc_norm_stderr": 0.02685345037700914 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956914, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956914 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7151979565772669, "acc_stderr": 0.016139174096522546, "acc_norm": 0.7151979565772669, "acc_norm_stderr": 0.016139174096522546 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5867052023121387, "acc_stderr": 0.02651126136940924, "acc_norm": 0.5867052023121387, "acc_norm_stderr": 0.02651126136940924 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961443, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961443 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5522875816993464, "acc_stderr": 0.02847293847803353, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.02847293847803353 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5980707395498392, "acc_stderr": 0.027846476005930473, "acc_norm": 0.5980707395498392, "acc_norm_stderr": 0.027846476005930473 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.5648148148148148, "acc_stderr": 0.027586006221607708, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.027586006221607708 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4078014184397163, "acc_stderr": 0.029316011776343555, "acc_norm": 0.4078014184397163, "acc_norm_stderr": 0.029316011776343555 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.39895697522816165, "acc_stderr": 0.01250675765529367, "acc_norm": 0.39895697522816165, "acc_norm_stderr": 0.01250675765529367 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4338235294117647, "acc_stderr": 0.030105636570016636, "acc_norm": 0.4338235294117647, "acc_norm_stderr": 0.030105636570016636 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.49019607843137253, "acc_stderr": 0.020223946005074305, "acc_norm": 0.49019607843137253, "acc_norm_stderr": 0.020223946005074305 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5818181818181818, "acc_stderr": 0.04724577405731572, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 0.04724577405731572 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6244897959183674, "acc_stderr": 0.03100120903989484, "acc_norm": 0.6244897959183674, "acc_norm_stderr": 0.03100120903989484 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6218905472636815, "acc_stderr": 0.034288678487786564, "acc_norm": 0.6218905472636815, "acc_norm_stderr": 0.034288678487786564 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.03528211258245231, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.03528211258245231 }, "harness|truthfulqa:mc|0": { "mc1": 0.33047735618115054, "mc1_stderr": 0.016466769613698307, "mc2": 0.46496694797516, "mc2_stderr": 0.015236674932834036 }, "harness|winogrande|5": { "acc": 0.734017363851618, "acc_stderr": 0.01241832315305105 }, "harness|gsm8k|5": { "acc": 0.21076573161485973, "acc_stderr": 0.011234280469030465 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k
[ "region:us" ]
2023-12-09T15:56:28+00:00
{"pretty_name": "Evaluation run of hfl/chinese-alpaca-2-13b-16k", "dataset_summary": "Dataset automatically created during the evaluation run of model [hfl/chinese-alpaca-2-13b-16k](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:53:33.265685](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k/blob/main/results_2023-12-09T15-53-33.265685.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5126179344828111,\n \"acc_stderr\": 0.0342051274120513,\n \"acc_norm\": 0.5178843368987507,\n \"acc_norm_stderr\": 0.034949756392914415,\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.016466769613698307,\n \"mc2\": 0.46496694797516,\n \"mc2_stderr\": 0.015236674932834036\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284738\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5728938458474407,\n \"acc_stderr\": 0.004936470085238487,\n \"acc_norm\": 0.7741485759808803,\n \"acc_norm_stderr\": 0.0041728722829842005\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5509433962264151,\n \"acc_stderr\": 0.030612730713641095,\n \"acc_norm\": 0.5509433962264151,\n \"acc_norm_stderr\": 0.030612730713641095\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n 
\"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523853,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523853\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.041634530313028585,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.041634530313028585\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916646,\n \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916646\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.44358974358974357,\n \"acc_stderr\": 0.0251891498947642,\n \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.0251891498947642\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7045871559633028,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\": 0.7045871559633028,\n \"acc_norm_stderr\": 0.019560619182976\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334383,\n \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334383\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n \"acc_stderr\": 0.02685345037700914,\n \"acc_norm\": 0.7863247863247863,\n \"acc_norm_stderr\": 0.02685345037700914\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7151979565772669,\n \"acc_stderr\": 0.016139174096522546,\n \"acc_norm\": 
0.7151979565772669,\n \"acc_norm_stderr\": 0.016139174096522546\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940924,\n \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940924\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961443,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961443\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607708,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607708\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39895697522816165,\n \"acc_stderr\": 0.01250675765529367,\n \"acc_norm\": 0.39895697522816165,\n \"acc_norm_stderr\": 0.01250675765529367\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016636,\n \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016636\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.020223946005074305,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.020223946005074305\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245231,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245231\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.016466769613698307,\n \"mc2\": 0.46496694797516,\n \"mc2_stderr\": 0.015236674932834036\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.01241832315305105\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21076573161485973,\n \"acc_stderr\": 0.011234280469030465\n }\n}\n```", "repo_url": 
"https://huggingface.co/hfl/chinese-alpaca-2-13b-16k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["**/details_harness|winogrande|5_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-53-33.265685.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T15_53_33.265685", "path": ["results_2023-12-09T15-53-33.265685.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T15-53-33.265685.parquet"]}]}]}
2023-12-09T15:57:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b-16k ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b-16k on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:53:33.265685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b-16k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b-16k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:53:33.265685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b-16k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b-16k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:53:33.265685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b-16k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b-16k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:53:33.265685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
dbd0ade179d5c99171467fa184b5db4f207f8dcd
# Dataset Card for Evaluation run of Undi95/X-MythoChronos-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Undi95/X-MythoChronos-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Undi95/X-MythoChronos-13B](https://huggingface.co/Undi95/X-MythoChronos-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Undi95__X-MythoChronos-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T15:55:58.756519](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__X-MythoChronos-13B/blob/main/results_2023-12-09T15-55-58.756519.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5641085013010667, "acc_stderr": 0.0335879510752552, "acc_norm": 0.570142814951906, "acc_norm_stderr": 0.03430315611658459, "mc1": 0.37821297429620565, "mc1_stderr": 0.01697633590754687, "mc2": 0.535496493693775, "mc2_stderr": 0.015937525418247476 }, "harness|arc:challenge|25": { "acc": 0.5844709897610921, "acc_stderr": 0.014401366641216383, "acc_norm": 0.5972696245733788, "acc_norm_stderr": 0.01433223630679015 }, "harness|hellaswag|10": { "acc": 0.6448914558852819, "acc_stderr": 0.004775681871529864, "acc_norm": 0.8338976299541924, "acc_norm_stderr": 0.0037141188843173825 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.040335656678483205, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.040335656678483205 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5886792452830188, "acc_stderr": 0.030285009259009798, "acc_norm": 0.5886792452830188, "acc_norm_stderr": 0.030285009259009798 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04076663253918567, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04076663253918567 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 
0.04878317312145633 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5202312138728323, "acc_stderr": 0.03809342081273957, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.03809342081273957 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171452, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171452 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4723404255319149, "acc_stderr": 0.03263597118409769, "acc_norm": 0.4723404255319149, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.044346007015849245, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.044346007015849245 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.041546596717075474, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29894179894179895, "acc_stderr": 0.023577604791655802, "acc_norm": 0.29894179894179895, "acc_norm_stderr": 0.023577604791655802 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04216370213557835, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04216370213557835 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6645161290322581, "acc_stderr": 0.02686020644472434, "acc_norm": 0.6645161290322581, "acc_norm_stderr": 0.02686020644472434 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4187192118226601, "acc_stderr": 0.03471192860518468, "acc_norm": 0.4187192118226601, "acc_norm_stderr": 0.03471192860518468 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6727272727272727, "acc_stderr": 0.03663974994391244, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.03663974994391244 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7222222222222222, "acc_stderr": 0.031911782267135466, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.031911782267135466 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8082901554404145, "acc_stderr": 0.02840895362624526, "acc_norm": 0.8082901554404145, "acc_norm_stderr": 0.02840895362624526 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5256410256410257, "acc_stderr": 0.025317649726448663, "acc_norm": 0.5256410256410257, "acc_norm_stderr": 0.025317649726448663 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5756302521008403, "acc_stderr": 
0.032104790510157764, "acc_norm": 0.5756302521008403, "acc_norm_stderr": 0.032104790510157764 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7467889908256881, "acc_stderr": 0.01864407304137504, "acc_norm": 0.7467889908256881, "acc_norm_stderr": 0.01864407304137504 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4027777777777778, "acc_stderr": 0.033448873829978666, "acc_norm": 0.4027777777777778, "acc_norm_stderr": 0.033448873829978666 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.030587591351604243, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.030587591351604243 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808503, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808503 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516303, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.043300437496507416, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.043300437496507416 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6871165644171779, "acc_stderr": 0.036429145782924055, "acc_norm": 0.6871165644171779, "acc_norm_stderr": 0.036429145782924055 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.045416094465039476, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.045416094465039476 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8333333333333334, "acc_stderr": 0.024414947304543678, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.024414947304543678 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7611749680715197, "acc_stderr": 0.015246803197398675, "acc_norm": 0.7611749680715197, "acc_norm_stderr": 0.015246803197398675 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6416184971098265, "acc_stderr": 0.025816756791584194, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.025816756791584194 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.488268156424581, "acc_stderr": 0.016717897676932162, "acc_norm": 0.488268156424581, "acc_norm_stderr": 0.016717897676932162 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6111111111111112, "acc_stderr": 0.027914055510467998, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.027914055510467998 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6463022508038585, "acc_stderr": 0.027155208103200865, "acc_norm": 0.6463022508038585, "acc_norm_stderr": 0.027155208103200865 }, "harness|hendrycksTest-prehistory|5": { 
"acc": 0.6419753086419753, "acc_stderr": 0.026675611926037106, "acc_norm": 0.6419753086419753, "acc_norm_stderr": 0.026675611926037106 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41843971631205673, "acc_stderr": 0.02942799403941999, "acc_norm": 0.41843971631205673, "acc_norm_stderr": 0.02942799403941999 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44002607561929596, "acc_stderr": 0.012678037478574513, "acc_norm": 0.44002607561929596, "acc_norm_stderr": 0.012678037478574513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5294117647058824, "acc_stderr": 0.03032024326500413, "acc_norm": 0.5294117647058824, "acc_norm_stderr": 0.03032024326500413 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5751633986928104, "acc_stderr": 0.019997973035458333, "acc_norm": 0.5751633986928104, "acc_norm_stderr": 0.019997973035458333 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6326530612244898, "acc_stderr": 0.03086214492108756, "acc_norm": 0.6326530612244898, "acc_norm_stderr": 0.03086214492108756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7661691542288557, "acc_stderr": 0.029929415408348384, "acc_norm": 0.7661691542288557, "acc_norm_stderr": 0.029929415408348384 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.46987951807228917, "acc_stderr": 0.03885425420866766, "acc_norm": 0.46987951807228917, "acc_norm_stderr": 0.03885425420866766 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.031267817146631786, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.031267817146631786 }, "harness|truthfulqa:mc|0": { "mc1": 0.37821297429620565, "mc1_stderr": 0.01697633590754687, "mc2": 0.535496493693775, "mc2_stderr": 0.015937525418247476 }, "harness|winogrande|5": { "acc": 0.744277821625888, "acc_stderr": 0.012261253845440474 }, "harness|gsm8k|5": { "acc": 0.22971948445792267, "acc_stderr": 0.011586857544997501 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Undi95__X-MythoChronos-13B
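In addition to the per-task `harness_*` configurations, the "results" configuration mentioned in the card above holds the aggregated metrics, and both can be loaded with the same pattern as the `harness_winogrande_5` example. The following is a small sketch assuming the `datasets` library and the configuration/split names listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the run described above.
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__X-MythoChronos-13B",
    "results",
    split="latest",  # or the timestamped split "2023_12_09T15_55_58.756519"
)

# Per-task details, e.g. the 5-shot GSM8K predictions.
gsm8k = load_dataset(
    "open-llm-leaderboard/details_Undi95__X-MythoChronos-13B",
    "harness_gsm8k_5",
    split="latest",
)
```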
[ "region:us" ]
2023-12-09T15:58:54+00:00
{"pretty_name": "Evaluation run of Undi95/X-MythoChronos-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/X-MythoChronos-13B](https://huggingface.co/Undi95/X-MythoChronos-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__X-MythoChronos-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:55:58.756519](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__X-MythoChronos-13B/blob/main/results_2023-12-09T15-55-58.756519.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5641085013010667,\n \"acc_stderr\": 0.0335879510752552,\n \"acc_norm\": 0.570142814951906,\n \"acc_norm_stderr\": 0.03430315611658459,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.535496493693775,\n \"mc2_stderr\": 0.015937525418247476\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216383,\n \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.01433223630679015\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6448914558852819,\n \"acc_stderr\": 0.004775681871529864,\n \"acc_norm\": 0.8338976299541924,\n \"acc_norm_stderr\": 0.0037141188843173825\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009798,\n \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009798\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n 
\"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655802,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655802\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472434,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472434\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5256410256410257,\n 
\"acc_stderr\": 0.025317649726448663,\n \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448663\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604243,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604243\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n \"acc_stderr\": 0.015246803197398675,\n \"acc_norm\": 0.7611749680715197,\n 
\"acc_norm_stderr\": 0.015246803197398675\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584194,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584194\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.488268156424581,\n \"acc_stderr\": 0.016717897676932162,\n \"acc_norm\": 0.488268156424581,\n \"acc_norm_stderr\": 0.016717897676932162\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510467998,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510467998\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037106,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037106\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.535496493693775,\n \"mc2_stderr\": 0.015937525418247476\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22971948445792267,\n \"acc_stderr\": 0.011586857544997501\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/X-MythoChronos-13B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["**/details_harness|winogrande|5_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-55-58.756519.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T15_55_58.756519", "path": ["results_2023-12-09T15-55-58.756519.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T15-55-58.756519.parquet"]}]}]}
2023-12-09T15:59:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Undi95/X-MythoChronos-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Undi95/X-MythoChronos-13B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T15:55:58.756519 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Undi95/X-MythoChronos-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/X-MythoChronos-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:55:58.756519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Undi95/X-MythoChronos-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/X-MythoChronos-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T15:55:58.756519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/X-MythoChronos-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/X-MythoChronos-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T15:55:58.756519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d13c7b39903c0ae50b957a02f5d03df23dcfebe9
# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/hfl/chinese-alpaca-2-13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [hfl/chinese-alpaca-2-13b](https://huggingface.co/hfl/chinese-alpaca-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T16:00:55.681332](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b/blob/main/results_2023-12-09T16-00-55.681332.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5502321350314341, "acc_stderr": 0.033838534455358144, "acc_norm": 0.5559937862519342, "acc_norm_stderr": 0.03456092398331123, "mc1": 0.34761321909424725, "mc1_stderr": 0.016670769188897303, "mc2": 0.5022258550236057, "mc2_stderr": 0.015284175194421176 }, "harness|arc:challenge|25": { "acc": 0.5418088737201365, "acc_stderr": 0.014560220308714697, "acc_norm": 0.5870307167235495, "acc_norm_stderr": 0.014388344935398329 }, "harness|hellaswag|10": { "acc": 0.59699263095001, "acc_stderr": 0.004894997736719051, "acc_norm": 0.7975502887870942, "acc_norm_stderr": 0.004010043978333125 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5921052631578947, "acc_stderr": 0.039993097127774734, "acc_norm": 0.5921052631578947, "acc_norm_stderr": 0.039993097127774734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5886792452830188, "acc_stderr": 0.030285009259009794, "acc_norm": 0.5886792452830188, "acc_norm_stderr": 0.030285009259009794 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5625, "acc_stderr": 0.04148415739394154, "acc_norm": 0.5625, "acc_norm_stderr": 0.04148415739394154 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, 
0.049604496374885836 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5260115606936416, "acc_stderr": 0.038073017265045125, "acc_norm": 0.5260115606936416, "acc_norm_stderr": 0.038073017265045125 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062946, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.37872340425531914, "acc_stderr": 0.03170995606040655, "acc_norm": 0.37872340425531914, "acc_norm_stderr": 0.03170995606040655 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.32275132275132273, "acc_stderr": 0.024078943243597016, "acc_norm": 0.32275132275132273, "acc_norm_stderr": 0.024078943243597016 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.04263906892795132, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.04263906892795132 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.667741935483871, "acc_stderr": 0.0267955608481228, "acc_norm": 0.667741935483871, "acc_norm_stderr": 0.0267955608481228 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4482758620689655, "acc_stderr": 0.03499113137676744, "acc_norm": 0.4482758620689655, "acc_norm_stderr": 0.03499113137676744 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6606060606060606, "acc_stderr": 0.03697442205031596, "acc_norm": 0.6606060606060606, "acc_norm_stderr": 0.03697442205031596 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.696969696969697, "acc_stderr": 0.032742879140268674, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.032742879140268674 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7979274611398963, "acc_stderr": 0.02897908979429673, "acc_norm": 0.7979274611398963, "acc_norm_stderr": 0.02897908979429673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5102564102564102, "acc_stderr": 0.025345672221942374, "acc_norm": 0.5102564102564102, "acc_norm_stderr": 0.025345672221942374 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2851851851851852, "acc_stderr": 0.027528599210340492, "acc_norm": 0.2851851851851852, "acc_norm_stderr": 0.027528599210340492 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5798319327731093, "acc_stderr": 0.03206183783236152, "acc_norm": 
0.5798319327731093, "acc_norm_stderr": 0.03206183783236152 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7541284403669725, "acc_stderr": 0.01846194096870843, "acc_norm": 0.7541284403669725, "acc_norm_stderr": 0.01846194096870843 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.030190282453501943, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.030190282453501943 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7088607594936709, "acc_stderr": 0.029571601065753374, "acc_norm": 0.7088607594936709, "acc_norm_stderr": 0.029571601065753374 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6278026905829597, "acc_stderr": 0.03244305283008731, "acc_norm": 0.6278026905829597, "acc_norm_stderr": 0.03244305283008731 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009224, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009224 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.04026187527591205, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.04026187527591205 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.042365112580946336, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6134969325153374, "acc_stderr": 0.038258255488486076, "acc_norm": 0.6134969325153374, "acc_norm_stderr": 0.038258255488486076 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.0457237235873743, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.0457237235873743 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.045416094465039476, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.045416094465039476 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.025598193686652244, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.025598193686652244 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465918, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465918 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7522349936143039, "acc_stderr": 0.015438083080568972, "acc_norm": 0.7522349936143039, "acc_norm_stderr": 0.015438083080568972 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6242774566473989, "acc_stderr": 0.02607431485165708, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.02607431485165708 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3865921787709497, "acc_stderr": 0.016286674879101026, "acc_norm": 0.3865921787709497, "acc_norm_stderr": 0.016286674879101026 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5947712418300654, "acc_stderr": 0.028110928492809075, "acc_norm": 0.5947712418300654, "acc_norm_stderr": 0.028110928492809075 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6366559485530546, "acc_stderr": 0.02731684767419271, "acc_norm": 0.6366559485530546, "acc_norm_stderr": 0.02731684767419271 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.6018518518518519, "acc_stderr": 0.027237415094592474, "acc_norm": 0.6018518518518519, "acc_norm_stderr": 0.027237415094592474 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, "acc_stderr": 0.029275532159704725, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.029275532159704725 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4230769230769231, "acc_stderr": 0.01261820406658839, "acc_norm": 0.4230769230769231, "acc_norm_stderr": 0.01261820406658839 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.47794117647058826, "acc_stderr": 0.030343264224213535, "acc_norm": 0.47794117647058826, "acc_norm_stderr": 0.030343264224213535 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5245098039215687, "acc_stderr": 0.020203517280261436, "acc_norm": 0.5245098039215687, "acc_norm_stderr": 0.020203517280261436 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.046737523336702384, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.046737523336702384 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5959183673469388, "acc_stderr": 0.03141470802586589, "acc_norm": 0.5959183673469388, "acc_norm_stderr": 0.03141470802586589 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6865671641791045, "acc_stderr": 0.032801882053486456, "acc_norm": 0.6865671641791045, "acc_norm_stderr": 0.032801882053486456 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7543859649122807, "acc_stderr": 0.03301405946987251, "acc_norm": 0.7543859649122807, "acc_norm_stderr": 0.03301405946987251 }, "harness|truthfulqa:mc|0": { "mc1": 0.34761321909424725, "mc1_stderr": 0.016670769188897303, "mc2": 0.5022258550236057, "mc2_stderr": 0.015284175194421176 }, "harness|winogrande|5": { "acc": 0.7561168113654302, "acc_stderr": 0.012068923278908189 }, "harness|gsm8k|5": { "acc": 0.25018953752843065, "acc_stderr": 0.011930334350873352 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b
[ "region:us" ]
2023-12-09T16:03:50+00:00
{"pretty_name": "Evaluation run of hfl/chinese-alpaca-2-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [hfl/chinese-alpaca-2-13b](https://huggingface.co/hfl/chinese-alpaca-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:00:55.681332](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b/blob/main/results_2023-12-09T16-00-55.681332.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5502321350314341,\n \"acc_stderr\": 0.033838534455358144,\n \"acc_norm\": 0.5559937862519342,\n \"acc_norm_stderr\": 0.03456092398331123,\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5022258550236057,\n \"mc2_stderr\": 0.015284175194421176\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714697,\n \"acc_norm\": 0.5870307167235495,\n \"acc_norm_stderr\": 0.014388344935398329\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59699263095001,\n \"acc_stderr\": 0.004894997736719051,\n \"acc_norm\": 0.7975502887870942,\n \"acc_norm_stderr\": 0.004010043978333125\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 
0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5102564102564102,\n \"acc_stderr\": 
0.025345672221942374,\n \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7541284403669725,\n \"acc_stderr\": 0.01846194096870843,\n \"acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.01846194096870843\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.038258255488486076,\n \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.038258255488486076\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465918,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465918\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n \"acc_stderr\": 0.015438083080568972,\n \"acc_norm\": 0.7522349936143039,\n 
\"acc_norm_stderr\": 0.015438083080568972\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n \"acc_stderr\": 0.016286674879101026,\n \"acc_norm\": 0.3865921787709497,\n \"acc_norm_stderr\": 0.016286674879101026\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.027237415094592474,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.027237415094592474\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213535,\n \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213535\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5245098039215687,\n \"acc_stderr\": 0.020203517280261436,\n \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.020203517280261436\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.03141470802586589,\n \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.03141470802586589\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n \"acc_stderr\": 0.032801882053486456,\n \"acc_norm\": 0.6865671641791045,\n \"acc_norm_stderr\": 0.032801882053486456\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987251,\n \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987251\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5022258550236057,\n \"mc2_stderr\": 0.015284175194421176\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908189\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25018953752843065,\n \"acc_stderr\": 0.011930334350873352\n }\n}\n```", "repo_url": "https://huggingface.co/hfl/chinese-alpaca-2-13b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-00-55.681332.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-00-55.681332.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-00-55.681332.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-00-55.681332.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-00-55.681332.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-00-55.681332.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["**/details_harness|winogrande|5_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-00-55.681332.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T16_00_55.681332", "path": ["results_2023-12-09T16-00-55.681332.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T16-00-55.681332.parquet"]}]}]}
2023-12-09T16:04:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T16:00:55.681332 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
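The condensed summary above mentions "To load the details from a run, you can for instance do the following:" but the snippet itself was dropped in this flattened rendering; a minimal sketch of that call, assuming the details repository for this run follows the `open-llm-leaderboard/details_<org>__<model>` naming used by the other cards in this document:

```python
from datasets import load_dataset

# NOTE: the repository name below is an assumption based on the
# details_<org>__<model> naming convention seen elsewhere in this document.
data = load_dataset(
    "open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b",
    "harness_winogrande_5",  # any config name listed in the metadata works here
    split="train",           # "train" always points to the latest results
)
print(data)
```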
[ "# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:00:55.681332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:00:55.681332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model hfl/chinese-alpaca-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T16:00:55.681332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5972192eb27c182a41bf54a7459babaed87599b8
# Dataset Card for Evaluation run of Severian/ANIMA-Nectar-v3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Severian/ANIMA-Nectar-v3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Severian/ANIMA-Nectar-v3](https://huggingface.co/Severian/ANIMA-Nectar-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T16:02:02.105784](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3/blob/main/results_2023-12-09T16-02-02.105784.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5279231991868109, "acc_stderr": 0.034095739528534785, "acc_norm": 0.5365465936132023, "acc_norm_stderr": 0.0349316183151297, "mc1": 0.3108935128518972, "mc1_stderr": 0.016203316673559693, "mc2": 0.4616473915095851, "mc2_stderr": 0.014431098139511664 }, "harness|arc:challenge|25": { "acc": 0.454778156996587, "acc_stderr": 0.014551507060836353, "acc_norm": 0.4948805460750853, "acc_norm_stderr": 0.014610624890309154 }, "harness|hellaswag|10": { "acc": 0.5621390161322446, "acc_stderr": 0.004951097802775953, "acc_norm": 0.7599083847839075, "acc_norm_stderr": 0.004262659388824526 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.506578947368421, "acc_stderr": 0.040685900502249704, "acc_norm": 0.506578947368421, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666665, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207763, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207763 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4723404255319149, "acc_stderr": 0.03263597118409769, "acc_norm": 0.4723404255319149, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728763, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3835978835978836, "acc_stderr": 0.025043757318520196, "acc_norm": 0.3835978835978836, "acc_norm_stderr": 0.025043757318520196 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6258064516129033, "acc_stderr": 0.027528904299845704, "acc_norm": 0.6258064516129033, "acc_norm_stderr": 0.027528904299845704 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.43842364532019706, "acc_stderr": 0.03491207857486517, "acc_norm": 0.43842364532019706, "acc_norm_stderr": 0.03491207857486517 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6484848484848484, "acc_stderr": 0.037282069986826503, "acc_norm": 0.6484848484848484, "acc_norm_stderr": 0.037282069986826503 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6767676767676768, "acc_stderr": 0.033322999210706444, "acc_norm": 0.6767676767676768, "acc_norm_stderr": 0.033322999210706444 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7098445595854922, "acc_stderr": 0.032752644677915166, "acc_norm": 0.7098445595854922, "acc_norm_stderr": 0.032752644677915166 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4641025641025641, "acc_stderr": 0.02528558599001784, "acc_norm": 0.4641025641025641, "acc_norm_stderr": 0.02528558599001784 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.46218487394957986, "acc_stderr": 0.032385469487589795, 
"acc_norm": 0.46218487394957986, "acc_norm_stderr": 0.032385469487589795 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7045871559633028, "acc_stderr": 0.019560619182976, "acc_norm": 0.7045871559633028, "acc_norm_stderr": 0.019560619182976 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3472222222222222, "acc_stderr": 0.032468872436376486, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.032468872436376486 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6176470588235294, "acc_stderr": 0.034107853389047205, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.034107853389047205 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6624472573839663, "acc_stderr": 0.030781549102026223, "acc_norm": 0.6624472573839663, "acc_norm_stderr": 0.030781549102026223 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.04225875451969638, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.04225875451969638 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6776859504132231, "acc_stderr": 0.042664163633521685, "acc_norm": 0.6776859504132231, "acc_norm_stderr": 0.042664163633521685 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6574074074074074, "acc_stderr": 0.045879047413018105, "acc_norm": 0.6574074074074074, "acc_norm_stderr": 0.045879047413018105 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6687116564417178, "acc_stderr": 0.03697983910025588, "acc_norm": 0.6687116564417178, "acc_norm_stderr": 0.03697983910025588 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.045416094465039476, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.045416094465039476 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.025598193686652265, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.025598193686652265 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7113665389527458, "acc_stderr": 0.016203792703197786, "acc_norm": 0.7113665389527458, "acc_norm_stderr": 0.016203792703197786 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5346820809248555, "acc_stderr": 0.026854257928258893, "acc_norm": 0.5346820809248555, "acc_norm_stderr": 0.026854257928258893 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33631284916201115, "acc_stderr": 0.015801003729145894, "acc_norm": 0.33631284916201115, "acc_norm_stderr": 0.015801003729145894 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5228758169934641, "acc_stderr": 0.028599936776089782, "acc_norm": 0.5228758169934641, "acc_norm_stderr": 0.028599936776089782 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6141479099678456, "acc_stderr": 0.027648149599751468, "acc_norm": 0.6141479099678456, "acc_norm_stderr": 0.027648149599751468 }, "harness|hendrycksTest-prehistory|5": 
{ "acc": 0.6265432098765432, "acc_stderr": 0.02691500301138016, "acc_norm": 0.6265432098765432, "acc_norm_stderr": 0.02691500301138016 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3617021276595745, "acc_stderr": 0.028663820147199495, "acc_norm": 0.3617021276595745, "acc_norm_stderr": 0.028663820147199495 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3578878748370274, "acc_stderr": 0.012243563850490313, "acc_norm": 0.3578878748370274, "acc_norm_stderr": 0.012243563850490313 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4411764705882353, "acc_stderr": 0.03016191193076711, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.03016191193076711 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.49836601307189543, "acc_stderr": 0.020227726838150117, "acc_norm": 0.49836601307189543, "acc_norm_stderr": 0.020227726838150117 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.04494290866252089, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.04494290866252089 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.031251275910891656, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.031251275910891656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6766169154228856, "acc_stderr": 0.03307615947979033, "acc_norm": 0.6766169154228856, "acc_norm_stderr": 0.03307615947979033 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7134502923976608, "acc_stderr": 0.03467826685703826, "acc_norm": 0.7134502923976608, "acc_norm_stderr": 0.03467826685703826 }, "harness|truthfulqa:mc|0": { "mc1": 0.3108935128518972, "mc1_stderr": 0.016203316673559693, "mc2": 0.4616473915095851, "mc2_stderr": 0.014431098139511664 }, "harness|winogrande|5": { "acc": 0.7371744277821626, "acc_stderr": 0.01237092252726201 }, "harness|gsm8k|5": { "acc": 0.047763457164518575, "acc_stderr": 0.00587438753622932 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3
[ "region:us" ]
2023-12-09T16:04:52+00:00
{"pretty_name": "Evaluation run of Severian/ANIMA-Nectar-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Severian/ANIMA-Nectar-v3](https://huggingface.co/Severian/ANIMA-Nectar-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:02:02.105784](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3/blob/main/results_2023-12-09T16-02-02.105784.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5279231991868109,\n \"acc_stderr\": 0.034095739528534785,\n \"acc_norm\": 0.5365465936132023,\n \"acc_norm_stderr\": 0.0349316183151297,\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4616473915095851,\n \"mc2_stderr\": 0.014431098139511664\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.454778156996587,\n \"acc_stderr\": 0.014551507060836353,\n \"acc_norm\": 0.4948805460750853,\n \"acc_norm_stderr\": 0.014610624890309154\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5621390161322446,\n \"acc_stderr\": 0.004951097802775953,\n \"acc_norm\": 0.7599083847839075,\n \"acc_norm_stderr\": 0.004262659388824526\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n 
\"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486517,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486517\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.032752644677915166,\n \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.032752644677915166\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4641025641025641,\n 
\"acc_stderr\": 0.02528558599001784,\n \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.02528558599001784\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7045871559633028,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\": 0.7045871559633028,\n \"acc_norm_stderr\": 0.019560619182976\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.034107853389047205,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.034107853389047205\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6624472573839663,\n \"acc_stderr\": 0.030781549102026223,\n \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.030781549102026223\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n \"acc_stderr\": 0.016203792703197786,\n \"acc_norm\": 
0.7113665389527458,\n \"acc_norm_stderr\": 0.016203792703197786\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258893,\n \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258893\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n \"acc_stderr\": 0.015801003729145894,\n \"acc_norm\": 0.33631284916201115,\n \"acc_norm_stderr\": 0.015801003729145894\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089782,\n \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199495,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199495\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3578878748370274,\n \"acc_stderr\": 0.012243563850490313,\n \"acc_norm\": 0.3578878748370274,\n \"acc_norm_stderr\": 0.012243563850490313\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03016191193076711,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03016191193076711\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150117,\n \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150117\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4616473915095851,\n \"mc2_stderr\": 0.014431098139511664\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.01237092252726201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.047763457164518575,\n \"acc_stderr\": 0.00587438753622932\n }\n}\n```", "repo_url": 
"https://huggingface.co/Severian/ANIMA-Nectar-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["**/details_harness|winogrande|5_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-02-02.105784.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T16_02_02.105784", "path": ["results_2023-12-09T16-02-02.105784.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T16-02-02.105784.parquet"]}]}]}
2023-12-09T16:05:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Severian/ANIMA-Nectar-v3 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Severian/ANIMA-Nectar-v3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T16:02:02.105784 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
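The card text above refers to a loading snippet that is stripped from this flattened rendering. A minimal sketch of that call is given below; the repo id is an assumption inferred from the usual open-llm-leaderboard naming pattern (details_<org>__<model>), while the config name `harness_winogrande_5` and the split names are taken from the configs listed in this record's metadata.

```python
from datasets import load_dataset

# Sketch only: the repo id is assumed from the usual open-llm-leaderboard
# naming pattern (details_<org>__<model>); it is not stated verbatim here.
repo = "open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3"

# Each per-task config exposes the timestamped split for this run plus a
# "latest" alias that points at the most recent results.
latest = load_dataset(repo, "harness_winogrande_5", split="latest")
timestamped = load_dataset(repo, "harness_winogrande_5",
                           split="2023_12_09T16_02_02.105784")
print(latest)
```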
[ "# Dataset Card for Evaluation run of Severian/ANIMA-Nectar-v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Severian/ANIMA-Nectar-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:02:02.105784(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Severian/ANIMA-Nectar-v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Severian/ANIMA-Nectar-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:02:02.105784(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Severian/ANIMA-Nectar-v3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Severian/ANIMA-Nectar-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T16:02:02.105784(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
95535f82053c55c9a4f61b29238873af46314aba
# Dataset Card for Evaluation run of augmxnt/shisa-7b-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/augmxnt/shisa-7b-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [augmxnt/shisa-7b-v1](https://huggingface.co/augmxnt/shisa-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_augmxnt__shisa-7b-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T16:04:45.905043](https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-7b-v1/blob/main/results_2023-12-09T16-04-45.905043.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2538930287407982, "acc_stderr": 0.030165648669279056, "acc_norm": 0.2461752509143941, "acc_norm_stderr": 0.030745578518168466, "mc1": 0.3561811505507956, "mc1_stderr": 0.016763790728446335, "mc2": 0.5249120169730028, "mc2_stderr": 0.015465385283654536 }, "harness|arc:challenge|25": { "acc": 0.5170648464163823, "acc_stderr": 0.0146028783885366, "acc_norm": 0.5614334470989761, "acc_norm_stderr": 0.014500682618212864 }, "harness|hellaswag|10": { "acc": 0.59699263095001, "acc_stderr": 0.00489499773671905, "acc_norm": 0.7862975502887871, "acc_norm_stderr": 0.00409081394822023 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 
0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.3561811505507956, "mc1_stderr": 0.016763790728446335, "mc2": 0.5249120169730028, "mc2_stderr": 0.015465385283654536 }, "harness|winogrande|5": { "acc": 0.7805840568271507, "acc_stderr": 0.01163126836060778 }, "harness|gsm8k|5": { "acc": 0.4162244124336619, "acc_stderr": 0.013577788334652672 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_augmxnt__shisa-7b-v1
[ "region:us" ]
2023-12-09T16:07:32+00:00
{"pretty_name": "Evaluation run of augmxnt/shisa-7b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [augmxnt/shisa-7b-v1](https://huggingface.co/augmxnt/shisa-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augmxnt__shisa-7b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:04:45.905043](https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-7b-v1/blob/main/results_2023-12-09T16-04-45.905043.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2538930287407982,\n \"acc_stderr\": 0.030165648669279056,\n \"acc_norm\": 0.2461752509143941,\n \"acc_norm_stderr\": 0.030745578518168466,\n \"mc1\": 0.3561811505507956,\n \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.5249120169730028,\n \"mc2_stderr\": 0.015465385283654536\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.0146028783885366,\n \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59699263095001,\n \"acc_stderr\": 0.00489499773671905,\n \"acc_norm\": 0.7862975502887871,\n \"acc_norm_stderr\": 0.00409081394822023\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 
0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n 
\"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 
0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.5249120169730028,\n \"mc2_stderr\": 0.015465385283654536\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4162244124336619,\n \"acc_stderr\": 0.013577788334652672\n }\n}\n```", "repo_url": "https://huggingface.co/augmxnt/shisa-7b-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-04-45.905043.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-04-45.905043.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-04-45.905043.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-04-45.905043.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-04-45.905043.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-04-45.905043.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["**/details_harness|winogrande|5_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-04-45.905043.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T16_04_45.905043", "path": ["results_2023-12-09T16-04-45.905043.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T16-04-45.905043.parquet"]}]}]}
2023-12-09T16:08:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of augmxnt/shisa-7b-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model augmxnt/shisa-7b-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T16:04:45.905043(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of augmxnt/shisa-7b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model augmxnt/shisa-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:04:45.905043(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of augmxnt/shisa-7b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model augmxnt/shisa-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:04:45.905043(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of augmxnt/shisa-7b-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model augmxnt/shisa-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T16:04:45.905043(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5fc12d4ba992cdc77c6e0e318a2a1320bb042713
# Dataset Card for Evaluation run of augmxnt/shisa-base-7b-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/augmxnt/shisa-base-7b-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [augmxnt/shisa-base-7b-v1](https://huggingface.co/augmxnt/shisa-base-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T16:05:53.719253](https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1/blob/main/results_2023-12-09T16-05-53.719253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2521543269268176, "acc_stderr": 0.0301585291688336, "acc_norm": 0.24535569023474865, "acc_norm_stderr": 0.030748338390153722, "mc1": 0.2729498164014688, "mc1_stderr": 0.015594753632006526, "mc2": 0.4239664190137454, "mc2_stderr": 0.014353789922903714 }, "harness|arc:challenge|25": { "acc": 0.47952218430034127, "acc_stderr": 0.014599131353035004, "acc_norm": 0.523037542662116, "acc_norm_stderr": 0.01459587320535827 }, "harness|hellaswag|10": { "acc": 0.5813582951603267, "acc_stderr": 0.004923281841828519, "acc_norm": 0.7763393746265684, "acc_norm_stderr": 0.004158455808204937 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845
}, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 
0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.2729498164014688, "mc1_stderr": 0.015594753632006526, "mc2": 0.4239664190137454, "mc2_stderr": 0.014353789922903714 }, "harness|winogrande|5": { "acc": 0.7853196527229677, "acc_stderr": 0.011539912734345402 }, "harness|gsm8k|5": { "acc": 0.35860500379075055, "acc_stderr": 0.013210317364134031 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1
[ "region:us" ]
2023-12-09T16:08:41+00:00
{"pretty_name": "Evaluation run of augmxnt/shisa-base-7b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [augmxnt/shisa-base-7b-v1](https://huggingface.co/augmxnt/shisa-base-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:05:53.719253](https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1/blob/main/results_2023-12-09T16-05-53.719253.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2521543269268176,\n \"acc_stderr\": 0.0301585291688336,\n \"acc_norm\": 0.24535569023474865,\n \"acc_norm_stderr\": 0.030748338390153722,\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006526,\n \"mc2\": 0.4239664190137454,\n \"mc2_stderr\": 0.014353789922903714\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.014599131353035004,\n \"acc_norm\": 0.523037542662116,\n \"acc_norm_stderr\": 0.01459587320535827\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5813582951603267,\n \"acc_stderr\": 0.004923281841828519,\n \"acc_norm\": 0.7763393746265684,\n \"acc_norm_stderr\": 0.004158455808204937\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n 
\"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n 
\"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006526,\n \"mc2\": 0.4239664190137454,\n \"mc2_stderr\": 0.014353789922903714\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345402\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35860500379075055,\n \"acc_stderr\": 0.013210317364134031\n }\n}\n```", "repo_url": "https://huggingface.co/augmxnt/shisa-base-7b-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["**/details_harness|winogrande|5_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-05-53.719253.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T16_05_53.719253", "path": ["results_2023-12-09T16-05-53.719253.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T16-05-53.719253.parquet"]}]}]}
2023-12-09T16:09:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of augmxnt/shisa-base-7b-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model augmxnt/shisa-base-7b-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T16:05:53.719253 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
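The loading snippet referred to above, using the config and split names listed in this record's metadata, is:

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1",
    "harness_winogrande_5",
    split="train",
)
```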
[ "# Dataset Card for Evaluation run of augmxnt/shisa-base-7b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model augmxnt/shisa-base-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:05:53.719253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of augmxnt/shisa-base-7b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model augmxnt/shisa-base-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:05:53.719253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of augmxnt/shisa-base-7b-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model augmxnt/shisa-base-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T16:05:53.719253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bbd24fb6d31c80ef0cedf61882c696759a056b19
# Medici Animation Instruct Dataset

### Small instruct dataset for animation generation with ManimCE
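A minimal loading sketch (the split name and record fields are assumptions, since the card does not document them):

```python
from datasets import load_dataset

# Load the instruct examples; "train" is an assumed split name.
dataset = load_dataset("mediciresearch/manimation", split="train")

# Inspect the first record to see the actual field names.
print(dataset[0])
```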
mediciresearch/manimation
[ "task_categories:text-generation", "size_categories:1K<n<10K", "license:mit", "code", "math", "sciences", "animation", "region:us" ]
2023-12-09T16:13:09+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "medici-anim", "tags": ["code", "math", "sciences", "animation"]}
2023-12-09T16:32:18+00:00
[]
[]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #license-mit #code #math #sciences #animation #region-us
# Medici Animation Instruct Dataset ### Small instruct dataset for animation generation with ManimCE
[ "# Medici Animation Instruct Dataset", "### Small instruct dataset for animation generation with ManimCE" ]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #license-mit #code #math #sciences #animation #region-us \n", "# Medici Animation Instruct Dataset", "### Small instruct dataset for animation generation with ManimCE" ]
[ 43, 8, 15 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #license-mit #code #math #sciences #animation #region-us \n# Medici Animation Instruct Dataset### Small instruct dataset for animation generation with ManimCE" ]
127b2ab5440b2efc775723ef84d116ad250ee6f4
# Dataset Card for Evaluation run of Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties](https://huggingface.co/Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-una-cybertron-v2-bf16-Ties",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-09T16:17:55.430276](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-una-cybertron-v2-bf16-Ties/blob/main/results_2023-12-09T16-17-55.430276.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6301095117404198, "acc_stderr": 0.03256047711626452, "acc_norm": 0.629798681344077, "acc_norm_stderr": 0.03322922327648027, "mc1": 0.3990208078335373, "mc1_stderr": 0.017142825728496767, "mc2": 0.555200832195947, "mc2_stderr": 0.01590653396629896 }, "harness|arc:challenge|25": { "acc": 0.6305460750853242, "acc_stderr": 0.014104578366491887, "acc_norm": 0.6501706484641638, "acc_norm_stderr": 0.013936809212158296 }, "harness|hellaswag|10": { "acc": 0.6700856403106951, "acc_stderr": 0.0046922082796905925, "acc_norm": 0.8367855008962358, "acc_norm_stderr": 0.0036880598312390156 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595852, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595852 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.630057803468208, "acc_stderr": 0.0368122963339432, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.0368122963339432 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.048108401480826346, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.048108401480826346 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224469, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224469 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3968253968253968, "acc_stderr": 0.025197101074246477, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.025197101074246477 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, 
"acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7516129032258064, "acc_stderr": 0.024580028921481003, "acc_norm": 0.7516129032258064, "acc_norm_stderr": 0.024580028921481003 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.03514528562175008, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919443, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6461538461538462, "acc_stderr": 0.024243783994062153, "acc_norm": 0.6461538461538462, "acc_norm_stderr": 0.024243783994062153 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.02995382389188704, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.02995382389188704 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.015776239256163224, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.015776239256163224 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03388857118502325, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967408, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967408 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7721518987341772, "acc_stderr": 0.027303484599069436, "acc_norm": 0.7721518987341772, "acc_norm_stderr": 0.027303484599069436 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7099236641221374, "acc_stderr": 0.03980066246467766, "acc_norm": 0.7099236641221374, "acc_norm_stderr": 0.03980066246467766 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 
0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7055214723926381, "acc_stderr": 0.03581165790474082, "acc_norm": 0.7055214723926381, "acc_norm_stderr": 0.03581165790474082 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8084291187739464, "acc_stderr": 0.014072859310451949, "acc_norm": 0.8084291187739464, "acc_norm_stderr": 0.014072859310451949 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7023121387283237, "acc_stderr": 0.024617055388677006, "acc_norm": 0.7023121387283237, "acc_norm_stderr": 0.024617055388677006 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42793296089385474, "acc_stderr": 0.01654788799741611, "acc_norm": 0.42793296089385474, "acc_norm_stderr": 0.01654788799741611 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6797385620915033, "acc_stderr": 0.02671611838015685, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.02671611838015685 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632952, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632952 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.025171041915309684, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.025171041915309684 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.44680851063829785, "acc_stderr": 0.029658235097666907, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.029658235097666907 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.455019556714472, "acc_stderr": 0.012718456618701772, "acc_norm": 0.455019556714472, "acc_norm_stderr": 0.012718456618701772 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.02916312857067073, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.02916312857067073 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6470588235294118, "acc_stderr": 0.01933314202079716, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.01933314202079716 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.046534298079135075, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.046534298079135075 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.027962677604768907, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.027962677604768907 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197771, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197771 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 
0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640044, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640044 }, "harness|truthfulqa:mc|0": { "mc1": 0.3990208078335373, "mc1_stderr": 0.017142825728496767, "mc2": 0.555200832195947, "mc2_stderr": 0.01590653396629896 }, "harness|winogrande|5": { "acc": 0.7726913970007893, "acc_stderr": 0.011778612167091088 }, "harness|gsm8k|5": { "acc": 0.6921910538286581, "acc_stderr": 0.012714401009923647 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Weyaxi__MetaMath-una-cybertron-v2-bf16-Ties
[ "region:us" ]
2023-12-09T16:20:46+00:00
{"pretty_name": "Evaluation run of Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties](https://huggingface.co/Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-una-cybertron-v2-bf16-Ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:17:55.430276](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-una-cybertron-v2-bf16-Ties/blob/main/results_2023-12-09T16-17-55.430276.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6301095117404198,\n \"acc_stderr\": 0.03256047711626452,\n \"acc_norm\": 0.629798681344077,\n \"acc_norm_stderr\": 0.03322922327648027,\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.555200832195947,\n \"mc2_stderr\": 0.01590653396629896\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491887,\n \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158296\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6700856403106951,\n \"acc_stderr\": 0.0046922082796905925,\n \"acc_norm\": 0.8367855008962358,\n \"acc_norm_stderr\": 0.0036880598312390156\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246477,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246477\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062153,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062153\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n 
\"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632952,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632952\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n \"acc_stderr\": 0.012718456618701772,\n \"acc_norm\": 0.455019556714472,\n \"acc_norm_stderr\": 0.012718456618701772\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768907,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768907\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.555200832195947,\n \"mc2_stderr\": 0.01590653396629896\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091088\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \"acc_stderr\": 0.012714401009923647\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-17-55.430276.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-17-55.430276.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-17-55.430276.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-17-55.430276.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-17-55.430276.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_17_55.430276", "path": ["**/details_harness|winogrande|5_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-17-55.430276.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T16_17_55.430276", "path": ["results_2023-12-09T16-17-55.430276.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T16-17-55.430276.parquet"]}]}]}
2023-12-09T16:21:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-09T16:17:55.430276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:17:55.430276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-09T16:17:55.430276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMath-una-cybertron-v2-bf16-Ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-09T16:17:55.430276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7c0d9a1e0583557afb5c1de41aba3f36be1cb19b
# Dataset Card for Merged Remote Landscapes dataset

[![version](https://img.shields.io/badge/version-0.0.1-orange.svg)]()

## Dataset summary

This is a merged version of the following datasets:
* [torchgeo/ucmerced](https://huggingface.co/datasets/torchgeo/ucmerced)
* [NWPU-RESISC45](https://huggingface.co/datasets/jonathan-roberts1/NWPU-RESISC45)

The merged dataset can be loaded with the `datasets` library:

```python
from datasets import load_dataset

dataset = load_dataset('EmbeddingStudio/merged_remote_landscapes_v1')
```

### Categories

The categories are the union of the categories of the original datasets:
agricultural, airplane, airport, baseball diamond, basketball court, beach, bridge, buildings, chaparral, church, circular farmland, cloud, commercial area, desert, forest, freeway, golf course, ground track field, harbor, industrial area, intersection, island, lake, meadow, mountain, overpass, palace, parking lot, railway, railway station, rectangular farmland, residential, river, roundabout, runway, sea ice, ship, snowberg, stadium, storage tanks, tennis court, terrace, thermal power station, wetland

Warning: synonymous and ambiguous categories were combined (see "Merge method").

## Motivation

EmbeddingStudio is an open-source framework that turns a joint "Embedding Model + Vector DB" setup into a full-cycle search engine out of the box: collect clickstream -> improve the search experience -> adapt the embedding model, and repeat.

The development of EmbeddingStudio is grounded in a scientific approach. At this early stage of development we cannot yet collect real clickstream data, so to run experiments and choose the best way to improve the embedding model we have to rely on synthetic or emulated data. The first step is to use the most transparent datasets and the easiest domain.

P.S.: this dataset is tagged for the image classification task, but in fact we use it for the metric learning task, with an additional step that emulates a clickstream.

We provide this dataset on Hugging Face so anyone can reproduce our results.

Check our repositories for more details:
* EmbeddingStudio Framework (coming soon, on 22.12.2023)
* Experiments (coming soon)

## Merge method

For this type of dataset the procedure is simple:
1. Remove duplicates.
2. Resolve synonymous and ambiguous categories using a simple map (CATEGORIES_MAPPING). A minimal illustrative sketch of both steps is given at the end of this card.

```python
CATEGORIES_MAPPING = {
    "dense residential": "residential",
    "medium residential": "residential",
    "mobile home park": "residential",
    "sparse residential": "residential",
    "storage tank": "storage tanks",
    "storage tanks": "storage tanks",
}
```

All details and the code base of the merging algorithm will be provided in our experiments repository. If you have any suggestions or find any mistakes, we will be happy to fix them, so that our experimental data has better quality.

## Contact info

* Alexander Yudaev: [email](mailto:[email protected]), [LinkedIn](https://www.linkedin.com/in/alexanderyudaev/)
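Below is the minimal, illustrative sketch of the merge procedure (deduplication plus category resolution) referenced in the "Merge method" section, for readers who want to approximate it before the experiments repository is published. It is not the exact code used to build this dataset: the split name, the `image`/`label` column names, and the byte-hash duplicate check are assumptions about the source datasets' schemas and a reasonable default, respectively.

```python
# Illustrative sketch only: column names ("image", "label") and the exact-duplicate
# check via hashing decoded image bytes are assumptions, not the official pipeline.
import hashlib

from datasets import concatenate_datasets, load_dataset

CATEGORIES_MAPPING = {
    "dense residential": "residential",
    "medium residential": "residential",
    "mobile home park": "residential",
    "sparse residential": "residential",
    "storage tank": "storage tanks",
}


def resolve_category(example, label_names):
    """Turn the integer label into a name and merge synonymous categories."""
    name = label_names[example["label"]].replace("_", " ").lower()
    example["category"] = CATEGORIES_MAPPING.get(name, name)
    return example


def merge(dataset_ids, split="train"):
    parts, seen = [], set()
    for dataset_id in dataset_ids:
        ds = load_dataset(dataset_id, split=split)
        label_names = ds.features["label"].names
        # Step 2: resolve synonymous and ambiguous categories.
        ds = ds.map(lambda ex: resolve_category(ex, label_names))
        # Step 1: drop exact duplicates by hashing the decoded image bytes.
        keep = []
        for idx, image in enumerate(ds["image"]):
            digest = hashlib.md5(image.tobytes()).hexdigest()
            if digest not in seen:
                seen.add(digest)
                keep.append(idx)
        # Keep only the columns shared by every source so they can be concatenated.
        ds = ds.select(keep).remove_columns(
            [c for c in ds.column_names if c not in ("image", "category")]
        )
        parts.append(ds)
    return concatenate_datasets(parts)


merged = merge(["torchgeo/ucmerced", "jonathan-roberts1/NWPU-RESISC45"])
```

Note that hashing decoded image bytes only catches byte-identical duplicates; detecting near-duplicates (e.g. via perceptual hashing) would require an additional step.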
EmbeddingStudio/merged_remote_landscapes_v1
[ "task_categories:image-classification", "size_categories:10K<n<100K", "license:apache-2.0", "landscapes", "geo", "remote photos", "metric learning", "region:us" ]
2023-12-09T16:26:43+00:00
{"license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "pretty_name": "Merged Remote Landscapes v1.0.0", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "category", "dtype": "string"}, {"name": "img_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 687610836.528, "num_examples": 26872}, {"name": "test", "num_bytes": 178694171.287, "num_examples": 6719}], "download_size": 843239857, "dataset_size": 866305007.815}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["landscapes", "geo", "remote photos", "metric learning"]}
2023-12-10T09:14:25+00:00
[]
[]
TAGS #task_categories-image-classification #size_categories-10K<n<100K #license-apache-2.0 #landscapes #geo #remote photos #metric learning #region-us
# Dataset Card for Merged Remote Landscapes dataset ![version]() ## Dataset summary This is a merged version of following datasets: * torchgeo/ucmerced * NWPU-RESISC45 ### Categories This is a union of categories from original datasets: agricultural, airplane, airport, baseball diamond, basketball court, beach, bridge, buildings, chaparral, church, circular farmland, cloud, commercial area, desert, forest, freeway, golf course, ground track field, harbor, industrial area, intersection, island, lake, meadow, mountain, overpass, palace, parking lot, railway, railway station, rectangular farmland, residential, river, roundabout, runway, sea ice, ship, snowberg, stadium, storage tanks, tennis court, terrace, thermal power station, wetland Warning: Synonymous and ambiguous categories were combined (see "Merge method"). ## Motivation EmbeddingStudio is the open-source framework, that allows you transform a joint "Embedding Model + Vector DB" into a full-cycle search engine: collect clickstream -> improve search experience-> adapt embedding model and repeat out of the box. In the development of EmbeddingStudio the scientific approach is a backbone. On the early stage of the development we can't collect real clickstream data, so to do experiments and choose the best way to improve embedding model we had to use synthetic or emulated data. And the first step is to use the most transparent datasets and the easiest domain. P.S. this dataset is tagged to be used for the image classification task, but in fact we use it for the metric learning task. And we do another step to emulate clickstream. We provide this dataset on HuggingFace, so anyone can reproduce our results. Check our repositories to get more details: * EmbeddingStudio Framework (coming soon at 22.12.2023) * Experiments (coming soon) ## Merge method For this type of dataset it's all simple: 1. Remove duplicates. 2. Resolve synonymous and ambiguous categories with using a simple map (CATEGORIES_MAPPING). All details and code base of merging algorithm will be provided in our experiments repository. If you have any suggestion or you find some mistakes, we will be happy to fix it, so our experimental data will have better quality. ## Contact info * Alexander Yudaev email LikedIn
[ "# Dataset Card for Merged Remote Landscapes dataset\n\n![version]()", "## Dataset summary\n\nThis is a merged version of following datasets:\n* torchgeo/ucmerced\n* NWPU-RESISC45", "### Categories\n\nThis is a union of categories from original datasets:\nagricultural, airplane, airport, baseball diamond, basketball court, beach, bridge, buildings, chaparral, church, circular farmland, cloud, commercial area, desert, forest, freeway, golf course, ground track field, harbor, industrial area, intersection, island, lake, meadow, mountain, overpass, palace, parking lot, railway, railway station, rectangular farmland, residential, river, roundabout, runway, sea ice, ship, snowberg, stadium, storage tanks, tennis court, terrace, thermal power station, wetland\n\nWarning: Synonymous and ambiguous categories were combined (see \"Merge method\").", "## Motivation\n\nEmbeddingStudio is the open-source framework, that allows you transform a joint \"Embedding Model + Vector DB\" into a full-cycle search engine: collect clickstream -> improve search experience-> adapt embedding model and repeat out of the box.\n\nIn the development of EmbeddingStudio the scientific approach is a backbone. On the early stage of the development we can't collect real clickstream data, so to do experiments and choose the best way to improve embedding model we had to use synthetic or emulated data. And the first step is to use the most transparent datasets and the easiest domain.\n\nP.S. this dataset is tagged to be used for the image classification task, but in fact we use it for the metric learning task. And we do another step to emulate clickstream.\n\nWe provide this dataset on HuggingFace, so anyone can reproduce our results.\n\nCheck our repositories to get more details:\n* EmbeddingStudio Framework (coming soon at 22.12.2023)\n* Experiments (coming soon)", "## Merge method\n\nFor this type of dataset it's all simple:\n1. Remove duplicates.\n2. Resolve synonymous and ambiguous categories with using a simple map (CATEGORIES_MAPPING).\n\n\n\nAll details and code base of merging algorithm will be provided in our experiments repository. If you have any suggestion or you find some mistakes, we will be happy to fix it, so our experimental data will have better quality.", "## Contact info\n\n* Alexander Yudaev email LikedIn" ]
[ "TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-apache-2.0 #landscapes #geo #remote photos #metric learning #region-us \n", "# Dataset Card for Merged Remote Landscapes dataset\n\n![version]()", "## Dataset summary\n\nThis is a merged version of following datasets:\n* torchgeo/ucmerced\n* NWPU-RESISC45", "### Categories\n\nThis is a union of categories from original datasets:\nagricultural, airplane, airport, baseball diamond, basketball court, beach, bridge, buildings, chaparral, church, circular farmland, cloud, commercial area, desert, forest, freeway, golf course, ground track field, harbor, industrial area, intersection, island, lake, meadow, mountain, overpass, palace, parking lot, railway, railway station, rectangular farmland, residential, river, roundabout, runway, sea ice, ship, snowberg, stadium, storage tanks, tennis court, terrace, thermal power station, wetland\n\nWarning: Synonymous and ambiguous categories were combined (see \"Merge method\").", "## Motivation\n\nEmbeddingStudio is the open-source framework, that allows you transform a joint \"Embedding Model + Vector DB\" into a full-cycle search engine: collect clickstream -> improve search experience-> adapt embedding model and repeat out of the box.\n\nIn the development of EmbeddingStudio the scientific approach is a backbone. On the early stage of the development we can't collect real clickstream data, so to do experiments and choose the best way to improve embedding model we had to use synthetic or emulated data. And the first step is to use the most transparent datasets and the easiest domain.\n\nP.S. this dataset is tagged to be used for the image classification task, but in fact we use it for the metric learning task. And we do another step to emulate clickstream.\n\nWe provide this dataset on HuggingFace, so anyone can reproduce our results.\n\nCheck our repositories to get more details:\n* EmbeddingStudio Framework (coming soon at 22.12.2023)\n* Experiments (coming soon)", "## Merge method\n\nFor this type of dataset it's all simple:\n1. Remove duplicates.\n2. Resolve synonymous and ambiguous categories with using a simple map (CATEGORIES_MAPPING).\n\n\n\nAll details and code base of merging algorithm will be provided in our experiments repository. If you have any suggestion or you find some mistakes, we will be happy to fix it, so our experimental data will have better quality.", "## Contact info\n\n* Alexander Yudaev email LikedIn" ]
[ 51, 19, 34, 176, 238, 95, 12 ]
[ "passage: TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-apache-2.0 #landscapes #geo #remote photos #metric learning #region-us \n# Dataset Card for Merged Remote Landscapes dataset\n\n![version]()## Dataset summary\n\nThis is a merged version of following datasets:\n* torchgeo/ucmerced\n* NWPU-RESISC45### Categories\n\nThis is a union of categories from original datasets:\nagricultural, airplane, airport, baseball diamond, basketball court, beach, bridge, buildings, chaparral, church, circular farmland, cloud, commercial area, desert, forest, freeway, golf course, ground track field, harbor, industrial area, intersection, island, lake, meadow, mountain, overpass, palace, parking lot, railway, railway station, rectangular farmland, residential, river, roundabout, runway, sea ice, ship, snowberg, stadium, storage tanks, tennis court, terrace, thermal power station, wetland\n\nWarning: Synonymous and ambiguous categories were combined (see \"Merge method\")." ]